267 Data Storage jobs in Singapore
Junior Data Storage
Posted today
Job Description
Line of Service
Internal Firm Services
Industry/Sector
Not Applicable
Specialism
IFS - Operations
Management Level
Intern/Trainee
Job Description & Summary
At PwC, we help clients build trust and reinvent so they can turn complexity into competitive advantage. We're a tech-forward, people-empowered network with more than 370,000 people in 149 countries. Across audit and assurance, tax and legal, deals and consulting, we help clients build, accelerate and sustain momentum. Find out more at
Our Firmwide Corporate Services unite to help build our competitive advantage with first-class support internally. Spanning Administration, Business Development, Chairman's Office, Compliance, Finance, Human Resources, Learning and Development, Legal, Marketing and Communications, Operations and Change Management, and Technology, we power our lines of service to make sure all of us have the right resources, services and technology to be the best we can be.
Not all of us work directly with external clients. Some of our most talented people choose to harness their skills, experience, expertise and service excellence within PwC. The possibilities are endless and our business landscape is changing every day.
Help us keep our workpapers organised, compliant, and easy to find. You'll review access logs and permission rights to identify the owner of each folder, confirm the correct data-retention rules with those owners, and move data to the right storage location using our in-house tools. This role suits someone meticulous, persistent, and comfortable following up with busy stakeholders.
In this role you will contribute to the following areas:
Identification of data owners
- Analyse access logs and current permission rights to determine the accountable owner for each folder/data set.
- Validate findings with the data owners and the designated person in charge in each business unit.
Confirmation of applicable retention policies
- Reach out to each confirmed owner to determine the applicable retention schedule for their data set (e.g., active, archive, restricted, disposal eligibility).
- Record owner confirmations, rationale, and required holds/exemptions.
Transfer of data & management of storage
- Use in-house tools to migrate data to the correct storage location based on the confirmed retention and confidentiality requirements.
- Maintain an auditable trail (checksums, timestamps, operator ID) of every data transfer or deletion performed during the clean-up.
Quality, documentation & controls
- Maintain a live inventory of folders, owners, retention status, and transfer outcomes.
- Track and remediate exceptions (missing owners, disputed retention, failed transfers).
Stakeholder coordination
- Proactively follow up with data owners to obtain timely responses.
- Escalate risks (e.g., orphaned data, sensitive content in the wrong tier) per the playbook.
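The audit-trail responsibility above (checksums, timestamps, operator ID) can be sketched with Python's standard library alone. This is a minimal, illustrative sketch; the function names and CSV log layout are hypothetical, not the firm's in-house tooling:

```python
import csv
import hashlib
from datetime import datetime, timezone
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Compute the SHA-256 checksum of a file, reading in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def log_transfer(log_path: Path, src: Path, dest: str, operator_id: str) -> dict:
    """Append one audit record (checksum, UTC timestamp, operator ID) to a CSV log."""
    record = {
        "source": str(src),
        "destination": dest,
        "sha256": sha256_of(src),
        "timestamp_utc": datetime.now(timezone.utc).isoformat(),
        "operator_id": operator_id,
    }
    is_new = not log_path.exists()
    with log_path.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(record))
        if is_new:
            writer.writeheader()   # write the column header once
        writer.writerow(record)
    return record
```

Recomputing the checksum at the destination and comparing it against the logged value is what makes the trail verifiable in an audit.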
Requirements (must-have)
- Strong attention to detail and ability to sustain focus on repetitive, accuracy-critical tasks.
- Self-motivation and persistence in obtaining responses from busy stakeholders.
- Working knowledge of file/folder permission concepts (e.g., owners, groups, read/write, inheritance) and access logs.
- Proficiency with Excel/CSV and basic data hygiene (sorting, deduping, lookups).
- Clear written and verbal communication; confident, polite follow-ups.
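The permission concepts in the must-haves (owners, groups, read/write bits) can be inspected with Python's stdlib. This sketch reads POSIX-style metadata only; ACL inheritance on Windows shares or SharePoint is richer than these bits and is not captured here:

```python
import os
import stat

def describe_entry(path: str) -> dict:
    """Summarise ownership and permission bits for one file or folder,
    using only the metadata exposed by os.stat()."""
    st = os.stat(path)
    return {
        "path": path,
        "mode": stat.filemode(st.st_mode),       # e.g. '-rw-r--r--'
        "owner_uid": st.st_uid,                  # numeric owner (0 on Windows)
        "group_gid": st.st_gid,
        "owner_can_write": bool(st.st_mode & stat.S_IWUSR),
        "group_can_read": bool(st.st_mode & stat.S_IRGRP),
    }
```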
Nice to have
- Exposure to records management or data retention concepts/policies.
- Familiarity with common storage platforms (e.g., SharePoint/OneDrive, network shares) and permission models.
- Basic scripting for automation (PowerShell or Python) and/or SQL for simple queries.
- Understanding of confidentiality labels/classifications and Singapore PDPA considerations.
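The data-hygiene tasks above (sorting, deduping, lookups) translate directly into a few lines of Python. A minimal sketch; the column names (`path`, `owner_id`, `name`) are hypothetical placeholders for whatever the real inventory uses:

```python
def dedupe_and_lookup(folders: list[dict], owners: list[dict]) -> list[dict]:
    """Sort folder rows by path, drop duplicate paths, and join each row to
    its owner record (a VLOOKUP-style merge on the 'owner_id' column)."""
    owner_by_id = {o["owner_id"]: o for o in owners}
    seen: set[str] = set()
    result = []
    for row in sorted(folders, key=lambda r: r["path"]):
        if row["path"] in seen:        # skip duplicate folder entries
            continue
        seen.add(row["path"])
        owner = owner_by_id.get(row["owner_id"], {})
        result.append({**row, "owner_name": owner.get("name", "UNKNOWN")})
    return result
```

The `"UNKNOWN"` fallback surfaces orphaned folders, the exact exception case the role is asked to track and remediate.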
Key performance indicators
- % of folders with confirmed owner and documented retention rule.
- Accuracy rate of owner/retention assignments (via spot checks/audits).
- Timeliness of data transfers and success rate (zero-error migrations).
- Reduction in orphaned data and over-privileged permissions.
- Documentation completeness (audit-ready inventory with evidence trail).
Tools you'll use
- In-house data discovery, migration, and logging utilities.
- Excel/CSV; ticketing or workflow tools (e.g., JIRA/ServiceNow equivalent); email/Teams for owner outreach.
Working style
- Structured, checklist-driven execution with high accuracy.
- Comfortable managing a queue of outreach threads and chasing responses respectfully.
- Escalates early when risk, ambiguity, or delays are identified.
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required:
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills
Optional Skills
Accepting Feedback, Active Listening, Communication, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity, Optimism, Teamwork, Well Being
Desired Languages (If blank, desired languages not specified)
Travel Requirements
0%
Available for Work Visa Sponsorship?
No
Government Clearance Required?
No
Job Posting End Date
Backend Software Engineer - TikTok Data Ecosystem (Storage)
Posted 2 days ago
Job Description
About TikTok
TikTok is the leading destination for short-form mobile video. At TikTok, our mission is to inspire creativity and bring joy. TikTok's global headquarters are in Los Angeles and Singapore, and we also have offices in New York City, London, Dublin, Paris, Berlin, Dubai, Jakarta, Seoul, and Tokyo.
Why Join Us
Inspiring creativity is at the core of TikTok's mission. Our innovative product is built to help people authentically express themselves, discover and connect – and our global, diverse teams make that possible. Together, we create value for our communities, inspire creativity and bring joy - a mission we work towards every day.
We strive to do great things with great people. We lead with curiosity, humility, and a desire to make impact in a rapidly growing tech company. Every challenge is an opportunity to learn and innovate as one team. We're resilient and embrace challenges as they come. By constantly iterating and fostering an "Always Day 1" mindset, we achieve meaningful breakthroughs for ourselves, our company, and our users. When we create and grow together, the possibilities are limitless. Join us.
Diversity & Inclusion
TikTok is committed to creating an inclusive space where employees are valued for their skills, experiences, and unique perspectives. Our platform connects people from across the globe and so does our workplace. At TikTok, our mission is to inspire creativity and bring joy. To achieve that goal, we are committed to celebrating our diverse voices and to creating an environment that reflects the many communities we reach. We are passionate about this and hope you are too.
Job highlights
Career growth opportunity, Paid leave, Flat organization
Responsibilities
About The Team
The TikTok Data Ecosystem Team has the vital role of crafting and implementing a storage solution for offline data in TikTok's recommendation system, which caters to more than a billion users. Their primary objectives are to guarantee system reliability, uninterrupted service, and seamless performance. They aim to create a storage and computing infrastructure that can adapt to various data sources within the recommendation system, accommodating diverse storage needs. Their ultimate goal is to deliver efficient, affordable data storage with easy-to-use data management tools for the recommendation, search, and advertising functions.
What you will be doing:
1. Responsible for the design and development of components for the distributed database HBase.
2. Responsible for the design and development of components for the single-node LSM engine RocksDB.
Qualifications
Minimum Qualifications:
- Bachelor's Degree or above, majoring in Computer Science or related fields;
- Proficiency in one of C/C++/Java;
- 2+ years of relevant development experience in storage engines, resource management, system tuning, and capacity planning (e.g., RocksDB, Redis);
- 2+ years of experience in distributed systems and big data technologies such as Kafka and Hadoop.
Preferred Qualifications:
- Embrace open-source software, with a track record of involvement in open-source projects and keen enthusiasm for engaging with the latest technologies.
- Exhibit knowledge in distributed consensus algorithms like Paxos/Raft.
- Show familiarity with distributed transaction models.
- Demonstrate proficiency in typical storage engines, including RocksDB, and a deep understanding of Redis internals at the code level.
- Display expertise in low-level aspects of operating systems, with a background in optimizing system performance for TCP/IP, I/O operations, and other critical components.
Data Infrastructure Specialist
Posted today
Job Description
We are seeking a skilled data engineer to collaborate with agencies on various projects, leveraging expertise in system design, data structure, and algorithms. This role involves working directly with clients to translate business requirements into technical specifications.
The ideal candidate will have experience in cloud technologies such as AWS, Azure, and Google Cloud, as well as proficiency in using Databricks. Additionally, they should be knowledgeable about designing, building, and maintaining batch and real-time data pipelines.
A bachelor's degree in computer science, software engineering, information technology, or a related field is required. Minimum 7 years of relevant experience is necessary for this position.
Responsibilities
- Collaborate with clients to understand their data needs and translate them into technical specifications.
- Design and build data products that meet client requirements, including ingestion pipelines, data storage solutions, and data visualization tools.
- Maintain existing data systems, ensuring optimal performance and scalability.
- Develop new technologies and processes to improve agency data infrastructure.
Requirements
- Bachelor's degree in computer science, software engineering, information technology, or a related field.
- Minimum 7 years of relevant experience.
- Deep understanding of system design, data structure, and algorithms.
- Experience in cloud technologies such as AWS, Azure, and Google Cloud.
- Proficiency in using Databricks.
- Knowledgeable about designing, building, and maintaining batch and real-time data pipelines.
This role offers a competitive salary and excellent benefits package, including health insurance, retirement plan, and paid time off.
How to Apply
If you are a motivated and detail-oriented individual with a passion for data engineering, please submit your application, including your resume and a cover letter explaining why you are the ideal candidate for this role.
Data Infrastructure Specialist
Posted today
Job Description
Seeking an experienced Data Engineer to drive enterprise AI and data initiatives forward. This role focuses on developing scalable data infrastructure, migrating legacy systems, and enabling advanced analytics and machine learning use cases.
Key Responsibilities:
- Design and implement ETL/ELT pipelines to support AI/ML models and business intelligence
- Migrate legacy data platforms to a modern Snowflake-based architecture
- Ensure secure, high-performance infrastructure aligned with data governance policies
- Collaborate with cross-functional teams including data scientists and business stakeholders
- Support real-time data processing and integration of unstructured data sources
- Contribute to AI model deployment by preparing datasets and ensuring seamless system integration
Technical Environment:
- Platforms: Snowflake, AWS (Glue, S3, RDS, Fargate), Azure
- Tools: dbt, MLflow, LangChain, Docker
- Languages: SQL, Python, Java
- AI/ML: TensorFlow, PyTorch, LLM pipelines, vector databases
Requirements:
- More than 4 years of hands-on experience in data engineering
- Experienced in Snowflake (SnowPro certification preferred)
- Strong knowledge of cloud-based data architecture and data warehouse migration
- Experience with ETL/ELT frameworks and MLOps tools
- Familiarity with data security, governance, and performance optimization
- Ability to collaborate effectively with technical and non-technical stakeholders
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field
Leading Data Infrastructure Innovator
Posted today
Job Description
**Data Engineer Role Overview:**
We are seeking an experienced Data Engineer to join our team. The ideal candidate will have a strong background in developing and maintaining large-scale data infrastructure, as well as experience with data modeling, architecture, and governance.
**Key Responsibilities:**
- Design and develop new data pipelines and architectures to support business growth and improve data quality.
- Collaborate with cross-functional teams to identify and prioritize data-related initiatives.
- Develop and maintain data models and metadata to ensure data accuracy and consistency.
- Implement data governance policies and procedures to ensure compliance with regulatory requirements.
**Requirements:**
- 5+ years of experience in data engineering, with a focus on developing and maintaining large-scale data infrastructure.
- Strong skills in SQL and Python, with experience in Spark or distributed computing.
- Experience with data modeling and architecture concepts.
- Experience with data platform tools like Databricks, Snowflake, Cloudera, etc.
- Experience with cloud providers like AWS, Azure, or GCP.
**Nice to Have:**
- Knowledge of data governance principles and practices.
- Experience with data security and access control.
- Ability to work independently and collaboratively as part of a team.
**What We Offer:**
- Competitive salary and benefits package.
- Opportunities for professional growth and development.
- A dynamic and collaborative work environment.
Global Data Infrastructure Specialist
Posted today
Job Description
**Job Title:** Data Center Facility Engineer
Description: Data Center Facility Engineers are responsible for the critical facility operations of overseas data centers. They manage changes to critical systems, respond to emergent failures, and track preventative maintenance.
Key Responsibilities:
- Manage capacity management of facility systems including power and cooling.
- Oversee availability risk of facility systems and drive resolution.
- Drive projects to improve capacity, efficiency, and reliability of current facility systems.
- Support IT managers with their projects and operations.
- Assist design and project teams on new site/data-hall construction and commissioning.
- Establish performance benchmarks, conduct analyses, and prepare reports on all aspects of critical facility infrastructure operations and maintenance.
To be successful in this role, you will need:
- Bachelor's degree in electrical engineering or a related field.
- 3+ years of critical facility management or operation experience in large-scale data centers or 5+ years of large-scale production facility management experience in large plants.
- Experience with MEP equipment such as UPS, generators, chillers, pumps, cooling towers, etc.
- Strong communication skills in English and attention to detail to identify risks and resolve them.
- Ability to communicate in Chinese and maintain a safe and efficient working environment.
Chief Data Infrastructure Specialist
Posted today
Job Description
As a Chief Data Infrastructure Specialist, you will design and implement scalable data storage solutions to ensure the integrity and dependability of large datasets.
Key responsibilities include designing and implementing data pipelines for efficient data ingestion, processing, and transformation. This includes leveraging Azure Data Factory or comparable technologies to create complex data models for analytics purposes and assemble large, complex data sets using Databricks.
Other key tasks include ensuring data security and compliance, designing validation and cleansing procedures, and collaborating with cross-functional teams to drive business outcomes through data-driven insights.
- Design and implement data storage solutions
- Develop data pipelines for efficient data ingestion and transformation
- Create data models for analytics purposes
- Assemble large, complex data sets
- Implement data validation and cleansing procedures
- Ensure data security and compliance
Required Skills:
- Azure
- Big Data
- Pipelines
- Scrum
- Hadoop
- ETL
- SQL
- Networking
- SQL Server
- Python
- Business Process
- Data Analytics
- Databases
- Linux
- Analysis Services
Real-Time Data Infrastructure Specialist
Posted today
Job Description
We are seeking an experienced Real-Time Data Infrastructure Specialist to join our team. The successful candidate will be responsible for designing and implementing data acquisition interfaces for contextualized data, ensuring seamless integration between systems.
- Drawing on a strong foundation in SQL query development, the ideal candidate will design and refine queries to extract, transform, and manage production and system data with ease.
- The role requires collaboration with cross-functional teams to drive investigations and design activities related to Plug & Produce (P&P) for MES and Historian platform integration.
- A key aspect of the job is reviewing and validating technical documentation for system interfaces to ensure accuracy and compliance.
- In this position, you will support validation activities by executing and documenting test cases to guarantee system readiness and adherence to regulatory requirements.
- You will work closely with Digital, Automation, MES, MSAT, and Global teams to analyze and resolve application incidents, providing timely follow-up and resolution to ensure minimal disruption.
- The ability to troubleshoot and resolve connectivity and data flow issues across MES, PLC, DCS, and Historian systems is crucial for success in this role.
- Your responsibility will include monitoring performance and availability of Historian platforms to maintain stability and reliability.
- You will also perform regular maintenance tasks such as log reviews, job monitoring, and health checks to prevent potential issues.
- Lastly, the role requires ensuring robust backup and recovery processes are in place and validated to mitigate data loss risks.
AI-Powered Data Infrastructure Specialist
Posted today
Job Description
We are seeking a Data Engineer to build and maintain data infrastructure that supports AI-powered detection of unauthorized privileged access.
Technical Product Manager - Data Infrastructure
Posted today
Job Description
Technical Product Manager - Data Infrastructure
Our team is responsible for building a stable, efficient, secure, and easy-to-use big data infrastructure and platform for the company. We provide data storage, computation, query, and analysis capabilities for business teams, data teams, data analysts, machine learning teams, BI teams, and others. This spans data collection and storage, offline computation over massive data, real-time stream computing, online analytical processing, and interactive queries, as well as big data development, production scheduling, data quality monitoring, and data maps. We provide a basic platform for compute scheduling, batch computing, real-time computing, structured data query, big data analysis, and KV query, and we help business teams build data reports, dashboards, real-time data processing, and data mining and analysis.
Job Description
Manage one of the Data Suite products, including but not limited to Data Map, Data Integration, Data Development, Data Management, Data Quality, Data Services, Data Metrics, Data Visualisation, etc.
Responsible for requirements analysis, product design, prototype development, as well as product operation.
Understand user needs and big data use cases, abstract complex requirements into solutions, and implement them in the product to empower users.
Develop and maintain relationships and communication channels with users; synthesize product features from user requirements and feedback.
Explore and continuously learn best practices from the big data industry, and plan the product blueprints in the long term.
Requirements
Bachelor's Degree or higher
At least 3 years of experience in product management, preferably data-related products.
Familiarity with data management, data development and data application use cases and products.
Excellent product skills and problem-solving capabilities.
Clear logical thinking and a service-oriented mindset.
Seniority level
Mid-Senior level
Employment type
Full-time
Job function
Information Technology
Industries
Internet Marketplace Platforms and Technology, Information and Internet