117 Big Data Tools jobs in Singapore
Big Data Engineer
Posted 11 days ago
Job Description
- We are seeking a highly skilled and motivated Big Data Engineer to join our data team. The ideal candidate will play a key role in designing, developing, and maintaining scalable big data solutions while providing technical leadership. This role will also support strategic Data Governance initiatives, ensuring data integrity, privacy, and accessibility across the organization.
Key Responsibilities:
- Design, implement, and optimize robust data pipelines and ETL/ELT workflows using SQL and Python (a minimal sketch follows this list).
- Collaborate closely with Data Engineers, Analysts, and cross-functional engineering teams to meet evolving data needs.
- Build synchronous and asynchronous data APIs for downstream systems to consume the data.
- Deploy and manage infrastructure using Terraform and other Infrastructure-as-Code (IaC) tools.
- Develop and maintain CI/CD pipelines for deploying data applications and services.
- Leverage strong experience in AWS services (e.g., S3, Glue, Lambda, RDS, Lake Formation) to support scalable and secure cloud-based data platforms.
- Handle both batch and real-time data processing effectively.
- Apply best practices in data modeling and support data privacy and data protection initiatives.
- Implement and manage data encryption and hashing techniques to secure sensitive information.
- Ensure adherence to software engineering best practices including version control, automated testing, and deployment standards.
- Lead performance tuning and troubleshooting for data applications and platforms.
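The pipeline and data-protection bullets above are stated at a high level. As a rough, hypothetical illustration only (not part of the job description), the sketch below extracts rows with plain SQL, hashes a sensitive column with a salted SHA-256, and loads the result; the table names, column names, salt, and SQLite connections are placeholder assumptions.

```python
import hashlib
import sqlite3  # stand-in for any SQL source; swap for your warehouse driver

import pandas as pd

SALT = "replace-with-a-secret-salt"  # hypothetical; a real salt belongs in a secrets manager

def hash_value(value: str) -> str:
    """Salted SHA-256 hash so raw identifiers never reach downstream tables."""
    return hashlib.sha256((SALT + value).encode("utf-8")).hexdigest()

def run_pipeline(src_conn, dst_conn) -> None:
    # Extract: plain SQL against the source system
    orders = pd.read_sql_query(
        "SELECT order_id, customer_email, amount FROM orders", src_conn
    )

    # Transform: protect the sensitive column, tidy the numeric field
    orders["customer_email"] = orders["customer_email"].map(hash_value)
    orders["amount"] = orders["amount"].round(2)

    # Load: append the cleaned batch to the destination table
    orders.to_sql("orders_clean", dst_conn, if_exists="append", index=False)

if __name__ == "__main__":
    with sqlite3.connect("source.db") as src, sqlite3.connect("target.db") as dst:
        run_pipeline(src, dst)
```

In a real deployment the SQLite connections would be swapped for the organisation's warehouse drivers, and the salt would come from a secrets manager rather than source code.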
Required Skills & Experience:
- Strong proficiency in SQL for data modeling, querying, and transformation.
- Advanced Python development skills with an emphasis on data engineering use cases.
- Hands-on experience with Terraform for cloud infrastructure provisioning.
- Proficiency with CI/CD tools, particularly GitHub Actions.
- Deep expertise in AWS cloud architecture and services.
- Demonstrated ability to create and evaluate ERDs and contribute to architectural decisions.
- Strong communication skills.
Preferred Qualifications:
- Experience with big data technologies such as Apache Spark, Hive, or Kafka (a minimal consumer sketch follows this list).
- Familiarity with containerization tools (e.g., Docker) and orchestration platforms (e.g., Kubernetes).
- Solid understanding of data governance, data quality, and security frameworks.
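For the Kafka item above, here is a minimal, hypothetical consumer sketch assuming the kafka-python client; the topic name, broker address, and message fields are invented for illustration.

```python
import json

from kafka import KafkaConsumer  # kafka-python client, assumed to be installed

# Hypothetical topic and broker address, for illustration only.
consumer = KafkaConsumer(
    "clickstream-events",
    bootstrap_servers="localhost:9092",
    group_id="bigdata-demo",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # A real job would validate, enrich, and forward the event; here we just print it.
    print(event.get("event_type"), event.get("user_id"))
```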
Big Data Engineer
Posted 11 days ago
Job Description
We are seeking a highly skilled and experienced Big Data Engineer to join our team. The ideal candidate will have a minimum of 5 years of experience managing data engineering jobs in big data environments, e.g., the Cloudera Data Platform. The successful candidate will be responsible for designing, developing, and maintaining data ingestion and processing jobs, and will also integrate data sets to provide seamless data access to users.
Skills Set And Track Record
- Good understanding of, and experience completing, projects using waterfall/Agile methodologies
- Analytical, conceptualisation and problem-solving skills
- Good understanding of analytics and data warehouse implementations
- Hands-on experience in big data engineering jobs using Python, PySpark, Linux, and ETL tools such as Informatica
- Strong SQL and data analysis skills. Hands-on experience in data virtualisation tools like Denodo will be an added advantage
- Hands-on experience with a reporting or visualization tool such as SAP BO or Tableau is preferred
- Track record in implementing systems using Cloudera Data Platform will be an added advantage
- Motivated and self-driven, with ability to learn new concepts and tools in a short period of time
- Passion for automation, standardization, and best practices
- Good presentation skills are preferred
The developer is responsible to:
- Analyse the Client's data needs and document the requirements
- Refine data collection/consumption by migrating data collection to more efficient channels
- Plan, design and implement data engineering jobs and reporting solutions to meet the analytical needs (a minimal PySpark sketch follows this list)
- Develop test plan and scripts for system testing, support user acceptance testing
- Work with the Client technical teams to ensure smooth deployment and adoption of the new solution
- Ensure the smooth operations and service level of IT solutions
- Support production issues
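As a rough sketch of the kind of PySpark batch job described above (an assumption for illustration, not taken from the posting), the following reads a landed CSV drop, cleans it, and writes a daily summary; all paths and column names are hypothetical, and on a Cloudera platform the locations would typically be HDFS or Hive paths.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily_txn_ingest").getOrCreate()

# Hypothetical landing path for the day's raw transaction files.
raw = spark.read.option("header", True).csv("/landing/transactions/2024-01-01/")

cleaned = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount").isNotNull())
       .withColumn("txn_date", F.to_date("txn_ts"))
)

# Aggregate to a reporting-friendly daily summary.
daily_summary = cleaned.groupBy("txn_date", "channel").agg(
    F.count("*").alias("txn_count"),
    F.sum("amount").alias("total_amount"),
)

daily_summary.write.mode("overwrite").parquet("/curated/daily_txn_summary/")
spark.stop()
```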
- Seniority level: Mid-Senior level
- Employment type: Contract
- Job function: Information Technology
- Industries: IT Services and IT Consulting
Big Data Analyst
Posted 19 days ago
Job Description
Job Responsibility:
- Work with the data science team on game data analysis, including organising data logic, basic data processing, analysis, and the corresponding development work.
- Carry out basic data analysis and machine learning analysis, and build the required data processing flows and data report visualizations.
- Develop data processing pipelines for data modelling, analysis, and reporting from large and complex transaction datasets
- Assist with engineering development, data construction, and maintenance when required.
Requirements:
- Degree in Computer Science or related technical field
- At least 2 years of experience in data analysis/data warehouse/mart development and BI reporting.
- At least 2 years of experience in ETL data processing.
- Good understanding of Python, SQL, and HiveQL/SparkSQL, and the relevant best practices and techniques for performance tuning (a minimal partition-pruning sketch follows this list); experience deploying models in production and adjusting model thresholds to improve performance is a plus.
- Familiarity with data visualization tools, such as Google Analytics or Tableau.
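To make the SparkSQL performance-tuning point above concrete, here is a hypothetical snippet showing a partition-pruned aggregation; the database, table, and column names are invented, and Hive metastore support is assumed to be available.

```python
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("game_kpi_report").enableHiveSupport().getOrCreate()
)

# Filtering on the partition column (dt) lets Spark/Hive prune partitions,
# which is usually the single biggest performance win on large event tables.
kpis = spark.sql("""
    SELECT dt,
           game_id,
           COUNT(DISTINCT user_id) AS dau,
           SUM(revenue)            AS daily_revenue
    FROM   events.game_sessions
    WHERE  dt BETWEEN '2024-01-01' AND '2024-01-07'
    GROUP  BY dt, game_id
""")

kpis.write.mode("overwrite").saveAsTable("reporting.game_daily_kpis")
spark.stop()
```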
Big Data Engineer
Posted today
Job Description
We are seeking a skilled Data Engineer to join our team.
In this role, you will be responsible for designing and building robust data pipelines to ingest, transform, and load high-volume, high-velocity data from various sources. You will work with multiple technologies including Apache NiFi, Spark, Airflow, Flink, Kafka, and Talend to create efficient ETL processes.
Key Responsibilities:
- Design and build scalable and fault-tolerant ETL pipelines using various tools and technologies (a minimal Airflow sketch follows this list).
- Develop batch and real-time data flows using languages such as Python, Go, Java, and Ruby.
- Optimize ETL jobs for performance, scalability, and low latency.
- Implement data validation, cleansing, and normalization processes for consistent AI model input.
- Integrate with AIOps platforms and ML pipelines using REST APIs or event-driven architectures.
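For the ETL-pipeline bullet above, this is a minimal, hypothetical Airflow DAG sketch (assuming Airflow 2.x); the task bodies are stubs, and the DAG id, schedule, and task names are placeholders rather than anything specified in the posting.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    # Stub: pull raw events from a source system (e.g., a Kafka topic or landing bucket).
    print("extracting")

def validate_and_clean(**context):
    # Stub: apply validation, cleansing, and normalization rules.
    print("cleaning")

def load(**context):
    # Stub: write the cleaned batch to the warehouse or a downstream API.
    print("loading")

# Airflow 2.4+ accepts `schedule`; older 2.x versions use `schedule_interval`.
with DAG(
    dag_id="hypothetical_events_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_clean = PythonOperator(task_id="validate_and_clean", python_callable=validate_and_clean)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_clean >> t_load
```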
To succeed in this role, you must have:
- Proficiency in one or more programming languages: Python, Go, Java, Ruby, JavaScript/TypeScript.
- Experience with ETL frameworks: Apache NiFi, Spark, Airflow, Flink, Kafka, Talend.
- Knowledge of data storage solutions: PostgreSQL, MongoDB, Elasticsearch, Snowflake, BigQuery.
- Understanding of streaming platforms: Kafka, Kinesis, Pub/Sub.
- Ability to integrate with AIOps platforms and ITSM systems.
This is an excellent opportunity to work on challenging projects and develop your skills in a dynamic environment. You will receive:
- A competitive salary package.
- Ongoing training and professional development opportunities.
- A collaborative and supportive work environment.
We regret that only shortlisted candidates will be contacted.
Big Data Engineer
Posted today
Job Description
Job Requirements:
- Degree in Information Technology or equivalent.
- Must have 6-8 years of experience in Data Warehousing.
- Extensive experience in ERD design/ETL/querying.
- Must have at least 5 years of Python development in Data Engineering.
- Minimum 6 years of SQL experience.
- Must have experience in big data technologies such as Apache Spark, Hive, or Kafka.
- Experience in AWS cloud architecture and services, and in Terraform for cloud infrastructure provisioning.
- Experience in CI/CD, Docker, and Kubernetes.
- Excellent written and verbal communication skills.
Big Data Specialist
Posted today
Job Description
We are seeking an accomplished data professional to develop and maintain cutting-edge data infrastructure. As a key member of our team, you will work with AI-driven confidence scoring and data engineering best practices to deliver robust, scalable, and secure data solutions.
Key Responsibilities
- Design and implement efficient ETL/ELT processes.
- Manage and maintain scalable data warehousing solutions.
- Deploy machine learning models into production environments.
- Ensure data quality, consistency, and integrity across diverse sources (a minimal validation sketch follows this list).
- Collaborate with software engineers to develop data-driven features.
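The data-quality bullet above can be made concrete with a small, hypothetical pandas sketch; the column names, rules, and thresholds are assumptions for illustration, not requirements from the posting.

```python
import pandas as pd

def check_quality(df: pd.DataFrame) -> list[str]:
    """Return human-readable data-quality issues; an empty list means the batch passes."""
    issues = []

    # Integrity: the primary key must be present and unique.
    if df["customer_id"].isna().any():
        issues.append("customer_id contains nulls")
    if df["customer_id"].duplicated().any():
        issues.append("customer_id contains duplicates")

    # Consistency: values must stay within an expected range.
    if (df["signup_year"] < 2000).any() or (df["signup_year"] > 2030).any():
        issues.append("signup_year outside expected range")

    return issues

if __name__ == "__main__":
    batch = pd.DataFrame(
        {"customer_id": [1, 2, 2, None], "signup_year": [2021, 1999, 2022, 2023]}
    )
    for issue in check_quality(batch):
        print("FAILED:", issue)
```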
Requirements
- At least 2 years of relevant experience in data engineering.
- Strong proficiency in Python.
- Experience with SQL, NoSQL, or Graph databases.
- Familiarity with GCP, AWS Bedrock, and Splunk.
- Experience with CI/CD setups.
- Strong foundation in algorithms, database structures, and integration strategies.
- Proven ability to design scalable data solutions.
Nice-to-Have / Preferred Skills
- Experience building scalable APIs and deploying ML/AI solutions.
- Understanding of event-driven architectures (e.g., Kafka, SQS).
- Exposure to graph data structures.
Benefits
- Collaborate with a cross-functional team of engineers, data scientists, and security experts.
- Opportunities for technical leadership and innovation.
We ensure the confidentiality of all applicants' information during the hiring process. By submitting your resume, you consent to the collection, use, and disclosure of your personal information as per our privacy policy.