542 Big Data jobs in Singapore
Big Data Engineer
Posted 15 days ago
Job Description
We are seeking a highly skilled and experienced Big Data Engineer to join our team. The ideal candidate will have a minimum of 4 years of experience managing data engineering jobs in a big data environment (e.g., Cloudera Data Platform). The successful candidate will be responsible for designing, developing, and maintaining data ingestion and processing jobs, and for integrating data sets to provide seamless data access to users.
1. Responsibilities
• Analyse the Authority’s data needs and document the requirements.
• Refine data collection/consumption by migrating data collection to more efficient channels.
• Plan, design and implement data engineering jobs and reporting solutions to meet the analytical needs.
• Develop test plans and scripts for system testing, and support user acceptance testing.
• Build reports and dashboards according to user requirements.
• Work with the Authority’s technical teams to ensure smooth deployment and adoption of new solutions.
• Ensure the smooth operations and service level of IT solutions.
• Support and resolve production issues.
2. What we are looking for
• Good understanding of, and a track record of completing, projects using Waterfall/Agile methodologies.
• Strong SQL, data modelling and data analysis skills are a must.
• Hands-on experience in big data engineering jobs using Python, PySpark, Linux, and ETL tools such as Informatica (see the illustrative sketch after this list).
• Hands-on experience in a reporting or visualisation tool such as SAP BO or Tableau is a must.
• Hands-on experience in DevOps deployment and data virtualisation tools like Denodo will be an advantage.
• Track record in implementing systems using Hive, Impala and Cloudera Data Platform will be preferred.
• Good understanding of analytics and data warehouse implementations.
• Ability to troubleshoot complex issues ranging from system resource to application stack traces.
• Track record in implementing systems with high availability, high performance, high security hosted at various data centres or hybrid cloud environments will be an added advantage.
• Passion for automation, standardization, and best practices.
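For illustration, a minimal PySpark ingestion job of the kind this role describes might look like the sketch below. Everything specific here is assumed for the example: the landing path, the column names, and the target Hive table analytics.transactions are hypothetical, and a production job on Cloudera Data Platform would add scheduling, logging, and error handling.

# Illustrative sketch only: a minimal PySpark ingestion job (paths, columns, and
# table names are hypothetical).
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("daily_ingest_example")
    .enableHiveSupport()          # assumes a Hive metastore is available
    .getOrCreate()
)

# Read a raw landing file (hypothetical path and layout).
raw = spark.read.option("header", True).csv("/landing/transactions/2024-01-01/*.csv")

# Basic cleansing and typing before loading into the warehouse.
cleaned = (
    raw.dropDuplicates(["txn_id"])
       .withColumn("txn_amount", F.col("txn_amount").cast("decimal(18,2)"))
       .withColumn("txn_date", F.to_date("txn_date", "yyyy-MM-dd"))
       .filter(F.col("txn_id").isNotNull())
)

# Append into a partitioned Hive table that downstream reports query.
(cleaned.write
    .mode("append")
    .partitionBy("txn_date")
    .saveAsTable("analytics.transactions"))

spark.stop()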
Big Data Engineer (Singapore) - Baidu, Inc.
Posted today
Job Description
Job responsibilities:
- Build the company's big data warehouse systems, including batch and streaming data pipelines.
- Develop an in-depth understanding of business systems and project customer needs; design and implement big data systems that meet user needs and ensure smooth project acceptance.
- Responsible for data integration and ETL architecture design and development.
- Research cutting-edge big data technologies and continuously optimize the big data platform.
Job requirements:
- Bachelor's degree or above in computer science, databases, machine learning, or a related field; more than 3 years of data development experience.
- Keen understanding of business data and the ability to analyze it quickly.
- Proficient in SQL; familiar with MySQL; very familiar with Shell, Java (or Scala), and Python.
- Familiar with common ETL technologies and principles; proficient in data warehouse design standards and their practical application; extensive experience tuning Spark and MapReduce jobs.
- Familiar with Hadoop, Hive, HBase, Spark, Flink, Kafka, and other big data components.
- Proficient in the use of mainstream analytical data systems such as ClickHouse, Doris, and TiDB; tuning experience is preferred.
Seniority level: Mid-Senior level
Employment type: Full-time
Job function: Information Technology and Engineering
Industries: Software Development, Technology, Information and Media, and Information Services
Principal Big Data Engineer
Posted today
Job Description
Design and implement scalable data pipelines to collect, process, and store large volumes of structured and unstructured data.
Ensure data quality, integrity, and consistency across various data sources.
Work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver solutions that meet business needs.
Integrate data from multiple sources, including databases, APIs, and third-party data providers.
Build and maintain big data infrastructure using appropriate technologies.
Optimize data storage and retrieval processes to improve performance and reduce costs.
Requirements:
Proven experience as a Big Data Engineer or similar role.
Experience with big data technologies such as Hadoop, Spark, Kafka, Hive, HBase, etc.
Proficiency in programming languages such as Python, Java, or Scala.
Strong knowledge of SQL and NoSQL databases.
Knowledge of machine learning and data analytics.
Big Data Engineer - TikTok
Posted 3 days ago
Job Description
About TikTok
TikTok is the leading destination for short-form mobile video. At TikTok, our mission is to inspire creativity and bring joy. TikTok's global headquarters are in Los Angeles and Singapore, and we also have offices in New York City, London, Dublin, Paris, Berlin, Dubai, Jakarta, Seoul, and Tokyo.
Why Join Us
Inspiring creativity is at the core of TikTok's mission. Our innovative product is built to help people authentically express themselves, discover and connect – and our global, diverse teams make that possible. Together, we create value for our communities, inspire creativity and bring joy - a mission we work towards every day.
We strive to do great things with great people. We lead with curiosity, humility, and a desire to make impact in a rapidly growing tech company. Every challenge is an opportunity to learn and innovate as one team. We're resilient and embrace challenges as they come. By constantly iterating and fostering an "Always Day 1" mindset, we achieve meaningful breakthroughs for ourselves, our company, and our users. When we create and grow together, the possibilities are limitless. Join us.
Diversity & Inclusion
TikTok is committed to creating an inclusive space where employees are valued for their skills, experiences, and unique perspectives. Our platform connects people from across the globe and so does our workplace. At TikTok, our mission is to inspire creativity and bring joy. To achieve that goal, we are committed to celebrating our diverse voices and to creating an environment that reflects the many communities we reach. We are passionate about this and hope you are too.
Job highlights
Positive team atmosphere, Career growth opportunity, Paid leave, 100+ mil users, Meals provided
Responsibilities
Team Introduction
The mission of the Data Platform Singapore Business Partnering (DPSG BP) team is to empower the TikTok Business with data. Our goal is to build a Data Warehouse that can cater to batch and streaming data, Data Products that provide useful information to build efficient data metrics & dashboards which will be used to make smarter business decisions to support business growth. If you're looking for a challenging ground to push your limits, this is the team for you!
Responsibilities:
- Translate business requirements and end-to-end designs into technical implementations, and be responsible for building batch and real-time data warehouses.
- Manage data modeling design and write and optimize ETL jobs.
- Collaborate with the business team to build data metrics based on the data warehouse.
- Responsible for building and maintaining data products.
- Be involved in rollouts, upgrades, implementation, and release of data system changes as required to streamline internal practices.
- Develop and implement techniques and analytics applications to transform raw data into meaningful information using data-oriented programming languages and visualisation software.
- Apply data mining, data modelling, natural language processing, and machine learning to extract and analyse information from large structured and unstructured datasets.
- Visualise, interpret, and report data findings and may create dynamic data reports as well.
Qualifications
Minimum Qualifications:
- At least 5 years in software engineering and 2 years of relevant experience in data engineering.
- Proficient in creating and maintaining complex ETL pipelines end-to-end while maintaining high reliability and security.
- Familiar with data warehouse concepts and have production experience in modeling design.
- Familiar with at least 1 distributed computing engine (e.g. Hive, Spark, Flink).
- Familiar with at least 1 NoSQL database is a plus (e.g. HBase).
Preferred Qualifications:
- Excellent interpersonal and communication skills with the ability to engage and manage internal and external stakeholders across all levels of seniority.
- Strong collaboration skills with the ability to build rapport across teams and stakeholders.
Lead Big Data Engineer
Posted 8 days ago
Job Description
What's on offer:
· Location: Singapore
· Client: End client user environment
Job Summary:
We are seeking a highly skilled and motivated Lead Big Data Engineer to join our data team. The ideal candidate will play a key role in designing, developing, and maintaining scalable big data solutions while providing technical leadership. This role will also support strategic Data Governance initiatives, ensuring data integrity, privacy, and accessibility across the organization.
Key Responsibilities:
· Design, implement, and optimize robust data pipelines and ETL/ELT workflows using SQL and Python.
· Lead architecture discussions, including the creation and review of Entity Relationship Diagrams (ERDs) and overall system design.
· Collaborate closely with Data Engineers, Analysts, and cross-functional engineering teams to meet evolving data needs.
· Deploy and manage infrastructure using Terraform and other Infrastructure-as-Code (IaC) tools.
· Develop and maintain CI/CD pipelines for deploying data applications and services.
· Leverage strong experience in AWS services (e.g., S3, Glue, Lambda, RDS, Lake Formation) to support scalable and secure cloud-based data platforms.
· Handle both batch and real-time data processing effectively.
· Apply best practices in data modeling and support data privacy and data protection initiatives.
· Implement and manage data encryption and hashing techniques to secure sensitive information (see the illustrative sketch after this list).
· Ensure adherence to software engineering best practices including version control, automated testing, and deployment standards.
· Lead performance tuning and troubleshooting for data applications and platforms.
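As a purely illustrative aside on the hashing responsibility above, the Python sketch below shows one common pseudonymisation pattern. The field names and the environment-variable pepper are assumptions for the example; a real system would source keys from a secrets manager and follow the organisation's data-protection standards.

# Illustrative sketch only: deterministic hashing of a sensitive field so it can
# still be joined on but not read back. PII_PEPPER and the field names are hypothetical.
import hashlib
import hmac
import os

PEPPER = os.environ.get("PII_PEPPER", "change-me").encode("utf-8")

def pseudonymize(value: str) -> str:
    """Hash a sensitive value (e.g., an email address) with a keyed HMAC-SHA256."""
    return hmac.new(PEPPER, value.strip().lower().encode("utf-8"), hashlib.sha256).hexdigest()

record = {"customer_id": "C-1001", "email": "jane.tan@example.com"}
record["email"] = pseudonymize(record["email"])
print(record)   # the email field is now a 64-character hex digest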
Required Skills & Qualifications:
· Strong proficiency in SQL for data modeling, querying, and transformation.
· Advanced Python development skills with an emphasis on data engineering use cases.
· Hands-on experience with Terraform for cloud infrastructure provisioning.
· Proficiency with CI/CD tools, particularly GitHub Actions.
· Deep expertise in AWS cloud architecture and services.
· Demonstrated ability to create and evaluate ERDs and contribute to architectural decisions.
· Strong communication and leadership skills with experience mentoring engineering teams.
Preferred Skills:
· Experience with big data technologies such as Apache Spark, Hive, or Kafka.
· Familiarity with containerization tools (e.g., Docker) and orchestration platforms (e.g., Kubernetes).
· Solid understanding of data governance, data quality, and security frameworks.
Lead Big Data Engineer
Posted 12 days ago
Job Description
- Experience: 10+ years
- Databases (SQL Server / Oracle / DB2 / Netezza): at least a good working knowledge of two of these
- Apache Spark Streaming or Apache Flink
- Kafka
- NoSQL databases: Cosmos DB, Document DB
- Spark, DataFrame API
- Hive (HQL)
- Scripting language: Shell or Bash
- CI/CD
- Experience with at least one Cloud Infra provider (Azure/AWS)
Good to have Skills :
- Certifications related to Data and Analytics
Big Data Engineer - Data Platform
Posted 3 days ago
Job Description
About TikTok
TikTok is the leading destination for short-form mobile video. At TikTok, our mission is to inspire creativity and bring joy. TikTok's global headquarters are in Los Angeles and Singapore, and we also have offices in New York City, London, Dublin, Paris, Berlin, Dubai, Jakarta, Seoul, and Tokyo.
Why Join Us
Inspiring creativity is at the core of TikTok's mission. Our innovative product is built to help people authentically express themselves, discover and connect – and our global, diverse teams make that possible. Together, we create value for our communities, inspire creativity and bring joy - a mission we work towards every day.
We strive to do great things with great people. We lead with curiosity, humility, and a desire to make impact in a rapidly growing tech company. Every challenge is an opportunity to learn and innovate as one team. We're resilient and embrace challenges as they come. By constantly iterating and fostering an "Always Day 1" mindset, we achieve meaningful breakthroughs for ourselves, our company, and our users. When we create and grow together, the possibilities are limitless. Join us.
Diversity & Inclusion
TikTok is committed to creating an inclusive space where employees are valued for their skills, experiences, and unique perspectives. Our platform connects people from across the globe and so does our workplace. At TikTok, our mission is to inspire creativity and bring joy. To achieve that goal, we are committed to celebrating our diverse voices and to creating an environment that reflects the many communities we reach. We are passionate about this and hope you are too.
Job highlights
Positive team atmosphere, Career growth opportunity, Flat organization
Responsibilities
About the team
The mission of the Data Platform Singapore Business Partnering (DPSG BP) team is to empower the TikTok Business with data. Our goal is to build a Data Warehouse that can cater to batch and streaming data, Data Products that provide useful information to build efficient data metrics & dashboards which will be used to make smarter business decisions to support business growth. If you're looking for a challenging ground to push your limits, this is the team for you!
As a data engineer in the Data Platform team, you will have the opportunity to build, optimize and grow one of the largest data platforms in the world. You'll have the opportunity to gain hands-on experience on all kinds of systems in the data platform ecosystem. Your work will have a direct and huge impact on the company's core products as well as hundreds of millions of users.
Responsibilities:
- Design and build data transformations efficiently and reliably for different purposes, e.g. reporting, growth analysis, and multi-dimensional analysis (see the illustrative sketch after this list).
- Design and implement reliable, scalable, robust and extensible big data systems that support core products and business.
- Establish solid design and best engineering practice for engineers as well as non-technical people.
- Develop and implement techniques and analytics applications to transform raw data into meaningful information using data-oriented programming languages and visualisation software.
- Apply data mining, data modelling, natural language processing, and machine learning to extract and analyse information from large structured and unstructured datasets.
- Visualise, interpret, and report data findings and may create dynamic data reports as well.
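As an illustration of the multi-dimensional analysis work mentioned above, the PySpark sketch below builds a small engagement cube. The table names (dwd.video_events, dws.video_engagement_cube) and the dimensions are hypothetical, not part of this posting.

# Illustrative sketch only: subtotals across every combination of two dimensions,
# useful for dashboards that slice by country, device type, or both.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("md_analysis_example").enableHiveSupport().getOrCreate()

events = spark.table("dwd.video_events")        # hypothetical detail-layer table

summary = (
    events.cube("country", "device_type")       # cube() emits all dimension combinations
          .agg(F.countDistinct("user_id").alias("active_users"),
               F.sum("watch_time_sec").alias("watch_time_sec"))
)

summary.write.mode("overwrite").saveAsTable("dws.video_engagement_cube")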
Qualifications
Minimum Qualifications:
- Bachelor's or Master's degree in Computer Science or a related technical field, or equivalent practical experience.
- At least 3 years of experience in the Big Data technologies (Hadoop, M/R, Hive, Spark, Metastore, Presto, Flume, Kafka, ClickHouse, Flink etc.).
Preferred Qualifications:
- Experience with performing data analysis, data ingestion and data integration.
- Experience with schema design and data modeling.
- Experience with ETL (Extraction, Transformation & Loading) and architecting data systems.
- Experience in writing, analyzing and debugging SQL queries.
- Experience in data privacy and security related projects.
- Deep understanding of various Big Data technologies.
- Passionate and self-motivated about technologies in the Big Data area.
- Solid communication and collaboration skills.
Big Data Test Engineer
Posted today
Job Description
About Patsnap
Patsnap empowers IP and R&D teams by providing better answers, enabling faster decision-making with confidence. Founded in 2007, Patsnap is a global leader in AI-powered IP and R&D intelligence. Our domain-specific LLM, trained on extensive proprietary innovation data, combined with Hiro, our AI assistant, delivers actionable insights that boost productivity and reduce R&D wastage. Trusted by over 15,000 companies including NASA, Tesla, PayPal, Sanofi, Dow Chemical, and Wilson Sonsini, Patsnap helps teams innovate faster.
About The Role
We are seeking a Big Data Test Engineer to ensure the quality and reliability of our data platforms. You will be responsible for testing and validating our big data solutions, working with technologies like Hadoop and Spark, and collaborating with our data engineering team. If passionate about automation testing and big data platform development, and eager to grow your skills, this role is ideal for you.
Key Responsibilities
- Test and ensure quality of big data and data-related products.
- Develop testing frameworks or tools, and contribute to CI platforms and automation.
- Validate data quality, consistency, and accuracy across pipelines (see the illustrative sketch after this list).
- Analyze test results and generate reports on software quality and coverage.
- Monitor and validate data workflows and pipelines.
- Improve testing processes for better efficiency.
- Ensure data integrity across sources and platforms.
- Collaborate with data engineers to enhance data quality.
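For illustration, data-quality checks like those described above are often expressed as automated tests. The pytest/PySpark sketch below is a minimal example; the table names, column names, and the 0.1% tolerance are assumptions, not Patsnap specifics.

# Illustrative sketch only: pytest-style data-quality checks over hypothetical tables.
import pytest
from pyspark.sql import SparkSession, functions as F

@pytest.fixture(scope="session")
def spark():
    return SparkSession.builder.appName("dq_tests").enableHiveSupport().getOrCreate()

def test_no_null_primary_keys(spark):
    df = spark.table("ods.patent_documents")            # hypothetical source table
    assert df.filter(F.col("doc_id").isNull()).count() == 0

def test_row_counts_match_between_layers(spark):
    src = spark.table("ods.patent_documents").count()
    tgt = spark.table("dwd.patent_documents").count()   # downstream layer after ETL
    # allow a small tolerance for late-arriving records
    assert abs(src - tgt) <= 0.001 * src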
Desired Qualifications
- Bachelor's degree in Computer Science, IT, or related field.
- 2+ years of software testing experience.
- Proficiency in Python or Java, with SQL and data validation skills.
- Experience with test automation tools and frameworks.
- Knowledge of version control (e.g., Git) and CI/CD tools.
- Strong communication skills.
- Ability to work independently and as part of a team.
- Attention to detail and quality focus.
Preferred Skills
- Experience with Hadoop, Spark, Hive, HBase.
- Knowledge of ETL and data warehousing.
- Experience with performance testing tools like JMeter.
- Cloud platform experience (AWS, Azure, GCP).
- ISTQB certification or equivalent.
Why Join Us
- Work with cutting-edge big data technologies.
- Collaborative and innovative environment.
- Opportunities for professional growth.
- Regular team events and knowledge sharing.
Seniority level: Not Applicable
Employment type: Full-time
Job function: Engineering and Information Technology
Industries: Software Development
Big Data Test Engineer
Posted today
Job Description
About PatSnap
Patsnap empowers IP and R&D teams by providing better answers, so they can make faster decisions with more confidence. Founded in 2007, Patsnap is the global leader in AI-powered IP and R&D intelligence. Our domain-specific LLM, trained on our extensive proprietary innovation data, coupled with Hiro, our AI assistant, delivers actionable insights that increase productivity for IP tasks by 75% and reduce R&D wastage by 25%. IP and R&D teams collaborate better with a user-friendly platform across the entire innovation lifecycle. Over 15,000 companies trust Patsnap to innovate faster with AI, including NASA, Tesla, PayPal, Sanofi, Dow Chemical, and Wilson Sonsini.
About the Role
We are looking for a Big Data Test Engineer to ensure the quality and reliability of our data platforms. The successful candidate will be responsible for rigorous testing and validation of our big data solutions, working with technologies like Hadoop and Spark while collaborating with our data engineering team. If you are passionate about automation testing and big data testing platform development, and are eager to grow your technical skills, this opportunity is for you.
Key Responsibilities
- Responsible for testing and quality assurance of big data and data-related products.
- Participate in the development of big data testing frameworks or testing tools, and contribute to the construction of continuous integration platforms and automation development.
- Validate data quality, consistency, and accuracy across data pipelines.
- Analyze test results and provide detailed reports on software quality and test coverage.
- Monitor and validate data workflows and pipelines.
- Continuously improve testing processes and methodologies to enhance efficiency and effectiveness.
- Ensure data integrity and accuracy across various data sources and platforms.
- Work closely with data engineers to improve data quality.
Desired Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 2+ years of experience in software testing.
- Proficiency in programming languages such as Python or Java, experience with SQL and data querying for data validation.
- Familiarity with test automation tools and frameworks.
- Knowledge of version control systems (e.g., Git) and experience with CI/CD tools.
- Strong communication and interpersonal skills.
- Ability to work independently and collaboratively in a team environment.
- Attention to detail and a commitment to quality.
Preferred Skills
- Experience with big data platforms (Hadoop, Spark, Hive, HBase).
- Knowledge of ETL processes and data warehousing concepts.
- Experience with performance testing tools (JMeter).
- AWS/Azure/GCP cloud platform experience.
- ISTQB certification or equivalent.
Why Join Us
- Work with cutting-edge big data technologies and tools.
- Collaborative and innovative work environment.
- Professional growth and learning opportunities.
- Regular team events and knowledge sharing sessions.