237 Data Steward jobs in Singapore
Chief Data Steward
Posted today
Job Description
With a focus on enhancing data-driven decision-making, this role involves developing and implementing strategies for effective data governance.
About the Position
- Collaborate with cross-functional teams to design and implement robust data management frameworks.
- Develop and maintain data quality standards, ensuring accuracy and consistency across organizational systems.
- A bachelor's degree in computer science or a related field, complemented by relevant professional experience (typically 2-5 years) in data governance or data quality roles.
- Proficiency in programming languages such as Python and SQL, along with expertise in data analysis and visualization tools.
- Strong communication skills, enabling seamless collaboration with stakeholders at all levels of the organization.
- Demonstrated ability to analyze complex data-related problems and develop innovative solutions.
As a key member of the data team, you will be responsible for fostering a culture of data-driven insights within the organization. This includes collaborating with business partners to identify areas for process improvement, developing and maintaining metadata repositories, and providing expert guidance on data quality and governance best practices.
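For context on the kind of data quality standards this role maintains, the following is a minimal, hypothetical Python/pandas sketch of an automated quality check (the column names and rules are illustrative assumptions, not taken from the posting):

```python
import pandas as pd

def run_basic_quality_checks(df: pd.DataFrame) -> dict:
    """Illustrative checks for completeness, uniqueness and validity.

    The column names ("customer_id", "email") are hypothetical examples.
    """
    return {
        # Completeness: share of non-null values per column
        "completeness": (1 - df.isna().mean()).to_dict(),
        # Uniqueness: duplicates on the assumed key column
        "duplicate_customer_ids": int(df["customer_id"].duplicated().sum()),
        # Validity: emails failing a simple pattern check
        "invalid_emails": int((~df["email"].str.contains(
            r"^[^@\s]+@[^@\s]+\.[^@\s]+$", na=False)).sum()),
    }

if __name__ == "__main__":
    sample = pd.DataFrame({
        "customer_id": [1, 2, 2, 4],
        "email": ["a@example.com", "bad-email", None, "d@example.org"],
    })
    print(run_basic_quality_checks(sample))
```

Checks like these are typically scheduled against organizational systems so that quality standards are verified continuously rather than ad hoc.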
Data Integrity Engineer
Posted today
Job Description
The ideal candidate will be responsible for providing quality oversight in the computerized system life cycle procedure and ensuring compliance with GXP standards and company policies and procedures. As a Subject Matter Expert (SME) in CSV topics, you will provide guidance and support to establish systems for ensuring data integrity and compliance with the CSV Plan and 21 CFR Part 11.
Responsibilities:
- Work with a cross-functional team to ensure quality oversight in the computerized system life cycle procedure and adherence to GXP standards and company policies and procedures.
- Act as a key focal point and Subject Matter Expert for CSV topics such as deviations, change management, investigations, and CAPA identification and closure.
- Generate, review and execute protocols/test cases for initial validation programs related to GXP computer systems.
- Review and approve qualification documentation such as URS, SLIA/CLIA, SRA/DIRA/DSA, DQ, SIOQ, HIOQ, FAT/FATSR, Traceability Matrix, etc.
- Ensure adherence to CSV master plans and execution plans for GXP computer systems such as DCS, PLC, BMS, MES, eBR, lab information systems and environmental monitoring systems, as well as business IT systems such as Maximo, network and ERP systems that are part of the GMP envelope in a Biologics manufacturing facility.
- Perform any other tasks assigned by Line Manager.
Requirements:
- Bachelor's degree or higher in Science or Engineering, or equivalent, with at least 3 years of relevant work experience.
- At least 1 year of experience with the Emerson DeltaV system, including software code review.
- Working experience and knowledge of CSV for start-up and brownfield projects across both Operational Technology and Information Technology systems is a plus.
- Hands-on experience in the validation life cycle of computer systems is a must.
- Experience in Siemens PLC system is a plus.
- Good communication skills.
- Excellent team player willing to work for the common goal.
- Knowledge of pharmaceutical regulatory requirements (GMP) is essential.
- Shows a high level of tenacity to ensure closure of issues.
- Largely self-managed with ability to communicate upwards and cross functionally to ensure all key project milestones are met.
Data Quality Specialist
Posted today
Job Description
We are seeking an experienced Gen AI Accuracy Analyst to contribute to our organization's success.
The selected candidate will be responsible for ensuring the quality and accuracy of Gen AI use cases through comprehensive monitoring and analysis.
- Monitor and analyze Gen AI use cases to ensure accuracy and quality
- Develop and implement monitoring processes to track key metrics for Gen AI use cases
- Generate regular reports on Gen AI performance and accuracy
- Liaise with stakeholders to provide clear and insightful reports on Gen AI performance
To excel in this role, you should have a strong background in data quality, artificial intelligence, and analytics. You must also possess:
- Bachelor's degree in Computer Science, Data Science, or related field
- Excellent communication and stakeholder management skills
- Ability to work in a fast-paced environment
In return for your expertise, we offer a competitive salary and benefits package, ongoing training and development opportunities, and a collaborative and dynamic work environment.
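As a rough, hypothetical sketch of the accuracy monitoring described above (the metric choice and record fields are assumptions, since the posting does not specify them), per-use-case accuracy could be computed along these lines:

```python
from dataclasses import dataclass

@dataclass
class EvalRecord:
    # Hypothetical fields; the posting does not define a data model.
    use_case: str
    expected: str
    generated: str

def accuracy_by_use_case(records: list[EvalRecord]) -> dict[str, float]:
    """Exact-match accuracy per Gen AI use case (one possible metric among many)."""
    totals: dict[str, int] = {}
    correct: dict[str, int] = {}
    for r in records:
        totals[r.use_case] = totals.get(r.use_case, 0) + 1
        if r.generated.strip().lower() == r.expected.strip().lower():
            correct[r.use_case] = correct.get(r.use_case, 0) + 1
    return {uc: correct.get(uc, 0) / n for uc, n in totals.items()}

if __name__ == "__main__":
    records = [
        EvalRecord("document-summary", "approved", "approved"),
        EvalRecord("document-summary", "rejected", "approved"),
    ]
    print(accuracy_by_use_case(records))  # {'document-summary': 0.5}
```

Regular stakeholder reports would then aggregate such metrics over time to surface trends and potential issues.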
Data Quality Specialist
Posted today
Job Description
Job Overview
">The role of an Instrument Data Analyst is a key position within the firm's Reference Data Operations team. This team plays a vital part in ensuring the quality, accuracy, and timeliness of data. As an analyst, you will work closely with professionals across various departments to manage data quality and exceptions for reference data.
">Responsibilities include:
">- Supporting instrument reference data to ensure consistent delivery of high-quality data on time;
- Managing corporate action events related to instrument reference data across all asset classes, understanding the impact of market events on static data and triggering events;
- Providing subject matter expertise support to downstream teams and clients;
- Ensuring proper controls are in place to reduce financial risks, implementing procedures to enforce controls, and ensuring they are adhered to;
- Partnering and collaborating with teams in other regions;
- Resolving client queries promptly and professionally;
- Contributing to ongoing developments and refinements of operational processes;
Requirements:
">- At least a Bachelor's degree in Finance, Science, or Technology, or equivalent;
- Open to fresh graduates;
- Strong knowledge of market data vendors like Bloomberg and Refinitiv;
- Knowledge of industry activities, corporate actions, index tracking, and capital markets;
- Inquisitive and analytical mindset; challenges the status quo and strives for excellence while paying attention to the risks involved;
- Ability to work independently and multitask in a fast-paced environment;
- Demonstrates ability to communicate effectively within a team and across departments;
- Strong analytical and problem-solving skills with attention to detail to formulate effective solutions that address pain points and improve efficiency;
- As the team covers the Asia Pacific region, staggered shift work and working on public holidays are required;
Preferred qualifications, capabilities, and skills include prior experience in Financial Services, digital tools like Alteryx, JIRA, and Confluence, and good understanding of data management principles for financial services - governance, quality, storage, modeling.
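By way of illustration only, a simple exception check on instrument reference data of the sort this team manages could look like the sketch below; the field names are hypothetical, and real vendor feeds (Bloomberg, Refinitiv) use their own schemas:

```python
import pandas as pd

# Hypothetical mandatory fields for an instrument record.
REQUIRED_FIELDS = ["isin", "asset_class", "currency"]

def flag_reference_data_exceptions(instruments: pd.DataFrame) -> pd.DataFrame:
    """Flag rows with missing mandatory fields or duplicate identifiers,
    one simple form of reference data exception management."""
    missing_any = instruments[REQUIRED_FIELDS].isna().any(axis=1)
    duplicate_id = instruments["isin"].duplicated(keep=False)
    exceptions = instruments[missing_any | duplicate_id].copy()
    exceptions["missing_mandatory_field"] = missing_any[exceptions.index]
    exceptions["duplicate_isin"] = duplicate_id[exceptions.index]
    return exceptions

if __name__ == "__main__":
    data = pd.DataFrame({
        "isin": ["US0378331005", "US0378331005", None],
        "asset_class": ["Equity", "Equity", "Bond"],
        "currency": ["USD", "USD", "SGD"],
    })
    print(flag_reference_data_exceptions(data))
```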
Data Quality Manager
Posted 1 day ago
Job Description
In this role, you will lead the design and implementation of data quality rules and controls, focusing on Risk, Finance, Compliance, and Regulatory Reporting domains. You will be responsible for driving the operationalization of data quality management using tools, ensuring compliance with banking regulatory standards like BCBS 239.
Key Responsibilities
- Implement data governance policies, standards, and procedures aligned with relevant regulatory frameworks such as BCBS 239.
- Work collaboratively with Risk, Finance, Compliance, and Regulatory Reporting teams to identify and address data quality gaps.
- Conduct training sessions on data quality policies and the use of data quality tools.
- Design and deploy functional data quality rules covering domains including risk metrics, finance (e.g., general ledger controls, reconciliation), compliance, and regulatory reporting.
- Partner with various teams to implement data quality rules that support key metrics for regulatory reports and specific business use cases.
- Ensure data quality and lineage for risk aggregation and reporting according to regulatory requirements.
- Coordinate with teams managing regulatory reporting systems to integrate data quality results into Collibra.
- Oversee Collibra DQ Issue Management workflows including root cause analysis, remediation plans, and closure tracking.
- Integrate Collibra with platforms such as Azure Data Lake, Alteryx, and regulatory reporting tools to capture and display data quality results.
- Develop dashboards within Collibra for tracking data quality scores, regulatory compliance, and exceptions.
- Configure and maintain Collibra data quality rules, workflows, and scorecards, aligning them with business glossaries and data dictionaries.
- Build end-to-end data lineage views within Collibra to connect regulatory reports back to source systems.
Job Requirements
- Bachelor’s degree in Computer Science, Finance, Data Management, or a related field.
- Relevant professional certifications such as CDMP, Collibra Ranger, DGSP, or CIMP are preferred.
- 2 to 8 years of experience in data quality, data governance, or data management roles within the financial services or banking sector.
- Strong knowledge of banking regulatory frameworks including BCBS 239 and other regional reporting requirements in APAC.
- Demonstrated experience designing data quality rules across risk, finance, compliance, and regulatory reporting domains.
- Experience collaborating with Risk, Finance, Regulatory Reporting, and Audit teams to identify and resolve data quality issues.
- Hands-on experience with the Collibra Data Intelligence Platform, including developing custom and out-of-the-box DQ rules, managing Collibra workflows for issue tracking, and handling metadata and lineage, would be an advantage.
- Familiarity with SQL, Python, or Alteryx to support data quality rule execution and integration.
- Proven experience integrating Collibra with Azure Data Platform, regulatory reporting solutions, and risk management tools.
- Strong analytical, documentation, and communication skills with a collaborative approach to working across teams.
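For illustration only (the table and column names are hypothetical, and this is not a Collibra rule definition), a reconciliation-style data quality rule such as the general ledger control mentioned above could be sketched in Python as:

```python
import pandas as pd

def gl_reconciliation_breaks(general_ledger: pd.DataFrame,
                             sub_ledger: pd.DataFrame,
                             tolerance: float = 0.01) -> pd.DataFrame:
    """Compare GL balances with aggregated sub-ledger balances per account
    and return accounts whose difference exceeds the tolerance.

    Column names ("account_id", "balance") are illustrative assumptions.
    """
    sub_totals = sub_ledger.groupby("account_id", as_index=False)["balance"].sum()
    merged = general_ledger.merge(sub_totals, on="account_id",
                                  how="outer", suffixes=("_gl", "_sub"))
    merged = merged.fillna({"balance_gl": 0.0, "balance_sub": 0.0})
    merged["difference"] = merged["balance_gl"] - merged["balance_sub"]
    return merged[merged["difference"].abs() > tolerance]

if __name__ == "__main__":
    gl = pd.DataFrame({"account_id": ["A1", "A2"], "balance": [100.0, 50.0]})
    sl = pd.DataFrame({"account_id": ["A1", "A1", "A2"], "balance": [60.0, 40.0, 55.0]})
    print(gl_reconciliation_breaks(gl, sl))  # A2 breaks by -5.0
```

In a Collibra-based setup, the pass/fail outcomes of rules like this would typically be captured as data quality results and surfaced on scorecards and dashboards; the exact integration depends on the platform configuration.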
Executive Data Quality Specialist
Posted today
Job Description
The Organisation is seeking a detail-oriented candidate to assist in maintaining and improving data quality standards across HR systems.
This role involves conducting thorough reviews of datasets, verifying and correcting data where necessary, and performing regular quality checks on cleaned data.
Additionally, the successful candidate will support administrative tasks such as preparing reports and processing invoices.
Required Skills and Qualifications
- A diploma holder with 3 years of strong hands-on experience in data cleaning and handling large datasets.
- Proficient in Microsoft Office Suite, especially advanced Excel functions.
- Meticulous attention to detail and quality control.
- Experience in using SAP-based HR systems is advantageous.
This is a one-year contract position, requiring 42 hours of work per week.
Other Information
Only shortlisted candidates will be notified. Please highlight your relevant skills and qualifications in your application.
Business Data Quality Specialist
Posted today
Job Description
We are seeking a detail-oriented professional to help maintain and improve data quality standards, while supporting administrative functions.
">- Conduct thorough reviews of datasets and records across various systems and sources to identify issues including missing data, inconsistencies, duplicates, and errors.
- Verify and correct data where necessary and obtain further information for incomplete documents.
- Enter rectified data into the relevant systems after verification.
- Collaborate with teams to understand data requirements and implement cleaning methodologies.
- Perform regular quality checks on cleaned data to ensure accuracy and completeness.
Requirements:
- A diploma holder with at least 3 years of experience in data cleaning and handling large datasets.
- Proficient in Microsoft Office Suite, especially advanced Excel functions.
- Meticulous attention to detail and quality control.
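As a minimal, hypothetical sketch of the data cleaning workflow this posting describes (column names and cleaning rules are assumptions for illustration), one pass of identifying and rectifying issues might look like:

```python
import pandas as pd

def clean_records(df: pd.DataFrame) -> tuple[pd.DataFrame, pd.DataFrame]:
    """Return (cleaned, needs_followup): trims whitespace, drops exact
    duplicates, and separates rows still missing mandatory values so they
    can be verified against source documents.

    The column names ("name", "department") are illustrative only.
    """
    cleaned = df.copy()
    # Normalize obvious formatting issues before checking for duplicates.
    for col in cleaned.select_dtypes(include="object").columns:
        cleaned[col] = cleaned[col].str.strip()
    cleaned = cleaned.drop_duplicates()
    # Rows with missing mandatory fields need further information.
    needs_followup = cleaned[cleaned[["name", "department"]].isna().any(axis=1)]
    cleaned = cleaned.drop(needs_followup.index)
    return cleaned, needs_followup

if __name__ == "__main__":
    raw = pd.DataFrame({
        "name": [" Alice ", "Alice", None],
        "department": ["HR", "HR", "Finance"],
    })
    ok, followup = clean_records(raw)
    print(ok)        # one deduplicated "Alice" row
    print(followup)  # the row missing a name
```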
Gen AI Data Quality Specialist
Posted today
Job Description
Job Overview
">This role is responsible for guaranteeing the precision and accuracy of Gen AI use cases through comprehensive monitoring and analysis. ">
The ideal candidate will establish and improve scalable accuracy monitoring processes applicable to various Gen AI use cases, providing stakeholders with clear and insightful reports on Gen AI performance.
Key Responsibilities:
1. Monitor and analyze accuracy data to ensure data quality and relevance for performance evaluation.
2. Implement and refine monitoring processes to track key metrics for Gen AI use cases, enabling proactive identification of performance trends and potential issues.
3. Generate regular reports on the accuracy and performance of Gen AI use cases, highlighting trends and insights for stakeholders and various forums.
4. Contribute to the preparation of Gen AI test data, focusing on document samples.
Requirements:
* Strong project management skills
* Excellent interpersonal and leadership skills
* Proficient in Microsoft Office and Excel
* Experience in risk management and data quality
* Agile methodologies and project planning
* PMP certification a plus
Senior Code Designer (Data Quality)
Posted 11 days ago
Job Description
- Should have a minimum of 10 years' experience in relevant activities.
- Solution design using proven patterns, awareness of anti-patterns, performance tuning.
- Develop and maintain web applications using Java/J2EE, Spring, AngularJS, Spring MVC/Struts, multi-threading, RESTful web services, Swagger, JMS/WebSphere MQ, JavaScript, jQuery, XML, XSLT, XPath, XSD.
- A strong understanding of recent Java language features.
- Design and implement user interfaces using ReactJS, AngularJS, TypeScript, and UI component libraries such as Material-UI.
- Collaborate with UI/UX designers to translate designs into high-quality code and ensure the technical feasibility of UI/UX designs.
- Optimize applications for maximum speed and scalability.
- Ensure reliable and scalable message processing using Kafka.
- Work with NoSQL databases like MongoDB, with experience in best practices for NoSQL DB performance.
- Work with relational databases: MSSQL, Oracle, PostgreSQL.
- Source management: SVN/Git; TDD using JUnit, DBUnit; Jira/QC.
- Application servers: JBoss / WildFly / WebSphere.
- Write well-designed, testable, efficient code.
- Strong experience with and a good understanding of the SQL language.
- Strong experience with and a good understanding of Unix/Linux shell scripting.
- Experience with JIRA, Confluence, Maven, GitLab, Jenkins, SonarQube, and other deployment tools.
- Exposure to DevOps tools.
- Knowledge of implementing solutions on the Cloud, preferably AWS.
Qlik Dashboard Development / Data Quality Analyst (Cum Project Coordinator) (BANKING - Enterpris...
Posted 4 days ago
Job Description
Bank Sector Client Singapore: Enterprise Data Governance | Data Management
- Summary: Work with stakeholders to understand business and functional requirements for data quality initiatives, using SQL queries for data extraction/table creation to ensure optimal data quality scores.
- Mandatory Skills: SQL (Query, Table Creation), Qlik Dashboard Visualization Development, Stakeholder management, strong presentation skills in PowerPoint, Project/time management experience.
- Good to have: Python Scripting & Automation
Job Description: Technical development work for upcoming Data Quality initiatives
- Collaborate with internal and external stakeholders to understand business needs and translate them into analytical solutions.
- Strategic thinker with a proactive approach to problem solving paired with strong stakeholder engagement skills
- Ability to tell a story with data, articulating key points for consideration rather than reading numbers off the dashboard
- Perform data profiling using SQL on large datasets to identify potential data quality issues.
- Develop and amend Qlik dashboard visualizations in accordance with business requirements.
- Execute technical analysis with respect to data quality processes including requirement gathering, root cause analysis, dashboard wireframes and stakeholder engagement.
- Develop and perform SIT/UAT test cases for data quality rules and dashboard visualizations.
- Validate technical implementations with users to ensure deliverables are fit for requirements.
- Strong written and presentation communication skills including dashboards and PowerPoint presentations
- Business process improvement and optimization, including creation of SOPs
- Develop and amend data quality rules using Python or Informatica IDQ, in accordance with business requirements (good to have)
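By way of a hedged sketch of the SQL-based data profiling this role calls for (the table and column names below are assumptions, not drawn from the job description), profiling queries can be run and collected programmatically, for example:

```python
import sqlite3

# Illustrative profiling queries; "customers", "email" and "country"
# are hypothetical names used only for this example.
PROFILING_QUERIES = {
    "row_count": "SELECT COUNT(*) FROM customers",
    "null_emails": "SELECT COUNT(*) FROM customers WHERE email IS NULL",
    "distinct_countries": "SELECT COUNT(DISTINCT country) FROM customers",
}

def profile_table(conn: sqlite3.Connection) -> dict:
    """Run simple profiling queries and return their scalar results."""
    return {name: conn.execute(sql).fetchone()[0]
            for name, sql in PROFILING_QUERIES.items()}

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE customers (id INTEGER, email TEXT, country TEXT)")
    conn.executemany("INSERT INTO customers VALUES (?, ?, ?)",
                     [(1, "a@example.com", "SG"), (2, None, "SG"), (3, "c@example.com", None)])
    print(profile_table(conn))
```

Counts produced by queries like these would feed the data quality scores and Qlik dashboard visualizations the role develops; the same logic could equally be expressed as Informatica IDQ rules.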
===
About us:
D L Resources Pte Ltd is a leading provider of IT Professional Services & Banking outsourced staffing solutions, serving a diverse portfolio of clients across various industries including Financial Services Institutions, Banks & MNCs.
Interested candidates may reach out directly to our recruiters (Edwin: +65 8 8 3 3 0 1 9 2 | EA License No: 24C2333 | EA Personnel No: R24123520)