126 Data Developer jobs in Singapore
Big Data Developer
Posted today
Job Description
My Client
A global cryptocurrency exchange dedicated to providing secure, efficient, and innovative digital asset trading services. The platform offers a wide range of services, including spot trading, derivatives trading, and wealth management. They are looking for an experienced and motivated Big Data Developer to join their dynamic team. The role involves working closely with other developers, analysts, and product managers to build the high-quality, reliable data systems behind the platform.
Key Responsibilities:
- Develop scalable data processing architectures using Hadoop, Flink, and other Big Data tools.
- Implement and optimize data ingestion, transformation, and analysis pipelines.
- Design efficient data storage and retrieval mechanisms to support analytical needs.
- Work closely with business stakeholders to translate requirements into technical solutions.
- Ensure data quality, security, and compliance with industry standards.
- Identify and resolve performance bottlenecks in data processing systems.
- Keep up with advancements in Big Data technologies and recommend improvements.
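The responsibilities above center on streaming ingestion and transformation. As a concrete picture of the kind of logic involved, here is a toy tumbling-window aggregation in plain Python, standing in for what a Flink job would do over a Kafka topic (the event shape and key names are hypothetical, purely for illustration):

```python
from collections import defaultdict

def windowed_counts(events, window_secs=60):
    """Group (timestamp, key) events into fixed windows and count per key.

    A plain-Python stand-in for the tumbling-window aggregation a
    Flink job would perform over a stream of trade events.
    """
    counts = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % window_secs)  # align to the window boundary
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(3, "btc"), (45, "eth"), (61, "btc"), (119, "btc")]
result = windowed_counts(events)
# window [0, 60) holds one btc and one eth event; [60, 120) holds two btc events
```

A real pipeline would express the same grouping with Flink's windowing operators and consume from Kafka rather than a list.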
Requirements:
- Degree in Computer Science, Information Technology, or a related field.
- 3+ years of hands-on experience in Big Data engineering.
- Proficiency in Hadoop, Flink, Kafka, and related frameworks.
- Experience with database modeling and large-scale data storage solutions.
- Strong understanding of data security best practices.
- Excellent analytical and troubleshooting skills.
- Effective communication and teamwork capabilities.
- A proactive mindset with a passion for data-driven innovation.
If this sounds like your next move, please don't hesitate to apply. Kindly note that only shortlisted candidates will be contacted; we appreciate your understanding. Data provided is for recruitment purposes only.
About Us
Dada Consultants was established in 2017 with a commitment to providing the best recruitment services in Singapore. We comprise a dynamic head-hunting team dedicated to sourcing highly competent professionals in the IT industry. We provide enterprises with customized talent solutions and help talented professionals advance their careers.
EA Registration Number: R
Business Registration Number: W. Licence Number: 18S9037
Senior Data Developer
Posted today
Job Description
Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow—people with a unique combination of skill + passion. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level.
Job Description:
Job Title: Senior Data Developer – Azure ADF and Databricks
Experience Range: 8-12 Years
Location: Chennai, Hybrid
Employment Type: Full-Time
About UPS
UPS is a global leader in logistics, offering a broad range of solutions that include transportation, distribution, supply chain management, and e-commerce. Founded in 1907, UPS operates in over 220 countries and territories, delivering packages and providing specialized services worldwide. Our mission is to enable commerce by connecting people, places, and businesses, with a strong focus on sustainability and innovation.
About UPS Supply Chain Symphony
The UPS Supply Chain Symphony platform is a cloud-based solution that seamlessly integrates key supply chain components, including shipping, warehousing, and inventory management, into a unified platform. This solution empowers businesses by offering enhanced visibility, advanced analytics, and customizable dashboards to streamline global supply chain operations and decision-making.
About the role
We are seeking an experienced Senior Data Developer to join our data engineering team responsible for building and maintaining complex data solutions using Azure Data Factory (ADF), Azure Databricks, and Cosmos DB. The role involves designing and developing scalable data pipelines, implementing data transformations, and ensuring high data quality and performance. The Senior Data Developer will work closely with data architects, testers, and analysts to deliver robust data solutions that support strategic business initiatives.
The ideal candidate should possess deep expertise in big data technologies, data integration, and cloud-native data engineering solutions on Microsoft Azure. This role also involves coaching junior developers, conducting code reviews, and driving strategic improvements in data architecture and design patterns.
Key Responsibilities
- Data Solution Design and Development:
- Design and develop scalable and high-performance data pipelines using Azure Data Factory (ADF).
- Implement data transformations and processing using Azure Databricks.
- Develop and maintain NoSQL data models and queries in Cosmos DB.
- Optimize data pipelines for performance, scalability, and cost efficiency.
- Data Integration and Architecture:
- Integrate structured and unstructured data from diverse data sources.
- Collaborate with data architects to design end-to-end data flows and system integrations.
- Implement data security, governance, and compliance standards.
- Performance Tuning and Optimization:
- Monitor and tune data pipelines and processing jobs for performance and cost efficiency.
- Optimize data storage and retrieval strategies for Azure SQL and Cosmos DB.
- Collaboration and Mentoring:
- Collaborate with cross-functional teams including data testers, architects, and business analysts.
- Conduct code reviews and provide constructive feedback to improve code quality.
- Mentor junior developers, fostering best practices in data engineering and cloud development.
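The transformation work described above (ingest, cleanse, aggregate) can be sketched in plain Python. A Databricks job would express the same cleanse-and-aggregate step with PySpark DataFrame operations; the record fields and rejection rule here are hypothetical:

```python
def cleanse(rows):
    """Drop records missing required fields and normalise types."""
    out = []
    for r in rows:
        if r.get("order_id") is None or r.get("amount") is None:
            continue  # reject incomplete records (a real job might quarantine them)
        out.append({"order_id": str(r["order_id"]),
                    "region": r.get("region", "UNKNOWN"),
                    "amount": float(r["amount"])})
    return out

def aggregate_by_region(rows):
    """Sum amounts per region -- the shape of a typical curation step."""
    totals = {}
    for r in rows:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]
    return totals

raw = [{"order_id": 1, "region": "APAC", "amount": "10.5"},
       {"order_id": None, "amount": "3.0"},   # rejected: no order_id
       {"order_id": 2, "region": "APAC", "amount": 4.5},
       {"order_id": 3, "amount": 2.0}]        # region defaults to UNKNOWN
gold = aggregate_by_region(cleanse(raw))
# APAC totals 15.0; UNKNOWN totals 2.0
```

In ADF terms, each function corresponds to an activity in a pipeline; the orchestration, retries, and scheduling are what ADF adds on top.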
Primary Skills
- Data Engineering: Azure Data Factory (ADF), Azure Databricks.
- Cloud Platform: Microsoft Azure (Data Lake Storage, Cosmos DB).
- Data Modeling: NoSQL data modeling, Data warehousing concepts.
- Performance Optimization: Data pipeline performance tuning and cost optimization.
- Programming Languages: Python, SQL, PySpark
Secondary Skills
- DevOps and CI/CD: Azure DevOps, CI/CD pipeline design and automation.
- Security and Compliance: Implementing data security and governance standards.
- Agile Methodologies: Experience in Agile/Scrum environments.
- Leadership and Mentoring: Strong communication and coaching skills for team collaboration.
Soft Skills
- Strong problem-solving abilities and attention to detail.
- Excellent communication skills, both verbal and written.
- Effective time management and organizational capabilities.
- Ability to work independently and within a collaborative team environment.
- Strong interpersonal skills to engage with cross-functional teams.
Educational Qualifications
- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
- Relevant certifications in Azure and Data Engineering, such as:
- Microsoft Certified: Azure Data Engineer Associate
- Microsoft Certified: Azure Solutions Architect Expert
- Databricks Certified Data Engineer Associate or Professional
About the Team
As a Senior Data Developer, you will be working with a dynamic, cross-functional team that includes developers, product managers, and quality engineers. You will be a key player in the data engineering process, helping shape data strategies and ensuring the delivery of high-quality data solutions.
Employee Type:
Permanent
UPS is committed to providing a workplace free of discrimination, harassment, and retaliation.
Senior Java/Big Data Developer in Investment Banking
Posted today
Job Description
What You'll Do:
- Lead technical design for a game-changing Market Risk Intelligence project
- Architect solutions using modern big data stack (Spark, Kafka, Hadoop)
- Drive real-time processing and automated reporting capabilities
- Mentor team members and champion best practices
- Collaborate with global teams across 33+ countries
What We're Looking For:
- 8+ years software development experience
- 5+ years hands-on Java/J2EE expertise
- Strong big data skills: Spark Streaming, Kafka, Hadoop ecosystem
- Experience with CI/CD, testing frameworks, and agile methodologies
- Leadership mindset with passion for knowledge sharing
temenos data source developer
Posted today
Job Description
TEMENOS DATA SOURCE DEVELOPER
Important Information
Location: Singapore
JOB DESCRIPTION
Develop and parameterize our application FI-Master, built on Temenos Data Source (TDS), according to business requirements and operational needs.
Set up new interfaces, data providers, imports, and exports, including the necessary data mapping and rule setup in TDS.
Define and execute data migrations.
Execute complex data queries for analysis and data corrections.
Develop and support Bash and Python scripts.
Support business requests in daily work with analysis, solutions, advice, and corrections where necessary.
Monitor daily processes and support incident resolution as L3 support.
Set up and execute release deployments.
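The "complex data queries for analysis and data corrections" duty can be pictured with a small SQL sketch. This uses Python's built-in sqlite3 for illustration; a TDS installation would run against Oracle, and the table and column names here are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE instrument (isin TEXT, currency TEXT)")
conn.executemany("INSERT INTO instrument VALUES (?, ?)",
                 [("CH0001", "chf"), ("CH0002", "CHF"), ("US0003", "USD")])

# analysis: find records whose currency code is not upper case
bad = conn.execute(
    "SELECT isin FROM instrument WHERE currency <> UPPER(currency)").fetchall()

# correction: normalise the offending rows in place
conn.execute("UPDATE instrument SET currency = UPPER(currency) "
             "WHERE currency <> UPPER(currency)")
fixed = conn.execute("SELECT COUNT(*) FROM instrument "
                     "WHERE currency <> UPPER(currency)").fetchone()[0]
# one row was flagged before the fix; none remain after
```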
REQUIREMENTS
Higher education in IT (university of applied sciences, etc.) or an Advanced Federal Diploma of Higher Education in business informatics
4-5 years' experience in a similar role within the financial industry
Very good understanding of software design, development, and customization processes
Sound experience in parameterization in Temenos Data Source (TDS), including setup of new providers, imports, exports, and complex rules using plugins
Sound knowledge of the TDS data model and how to set up decoupling of a provider
Experience with different external data providers, mainly SIX and Bloomberg
Familiar with Oracle DB, SQL, and PL/SQL
Familiar with Python and pandas
Experience in Unix system management and JBoss
Experience in configuration management and deployment automation, especially with Bitbucket, Nexus, Jenkins, and Octopus
Familiar with working in an agile environment with Jira
Experience in incident management and 2nd/3rd-level support
Experience with requirements engineering and translating business requirements into IT solution design
Capability to analyze complex and poorly defined problems related to the application and to provide solutions independently
About Encora
Encora is a global company that offers Software and Digital Engineering solutions. Our practices include Cloud Services, Product Engineering & Application Modernization, Data & Analytics, Digital Experience & Design Services, DevSecOps, Cybersecurity, Quality Engineering, AI & LLM Engineering, among others.
At Encora, we hire professionals based solely on their skills and do not discriminate based on age, disability, religion, gender, sexual orientation, socioeconomic status, or nationality
Data Visualization Developer II
Posted today
Job Description
Designs and develops methods, processes, and systems to consolidate and analyze structured and unstructured data from diverse sources, including "big data" sources. Develops and uses advanced software programs, algorithms, querying, and automated processes to cleanse, integrate, and evaluate datasets and to model complex business problems. Is familiar with disciplines such as Natural Language Processing, Machine Learning, predictive modeling, statistical analysis, and hypothesis testing. Works with cross-discipline teams to ensure connectivity between various databases and systems. Identifies meaningful insights and interprets and communicates findings and recommendations. May develop information tools, algorithms, dashboards, and queries to monitor and improve business performance. Maintains awareness of emerging analytics and big-data technologies.
We are seeking a talented and experienced Data Visualization Developer to join our Program Integration and Support Team. The ideal candidate will be responsible for developing dashboards and reports using Power BI to meet customer business needs. These dashboards and reports will be distributed to high-level Federal Government executives to assist them with decisions regarding future National Airspace System (NAS) infrastructure and equipment acquisitions and deployment.
Responsibilities:
- Develop dashboards and reports using Power BI to meet customer business needs.
- Provide consultation and suggest best approaches for the development of new applications.
- Responsible for data collection tool setup, maintenance, and administration.
- Responsible for mining data from FAA databases to identify trends and insights.
- Responsible for ensuring data collection/reporting validity.
- Responsible for conducting analysis of textual data.
- Responsible for assisting with annual sample criteria development.
- Work closely with a high-energy development team to design, develop, troubleshoot, and debug software programs.
- Responsible for lessons learned, trend, and root cause analysis.
- Responsible for comparing the results of Quality Review Point (QRP) assessments across project types.
- Responsible for statistical analysis.
- Responsible for developing and distributing monthly, quarterly, and annual reports at the program and senior levels.
- Responsible for monthly, quarterly, and annual rollup reporting of results and recommendations to senior leadership.
- Responsible for monthly, quarterly, and annual lessons learned collection/distribution tool setup and administration.
Requirements:
Required Qualifications:
- Must be able to obtain and maintain a Public Trust Clearance.
- Typically requires BS and 2 – 4 years of prior relevant experience or Masters with less than 2 years of prior relevant experience.
- Must possess excellent oral and written communication skills, with proficiency in MS Office products (Word, Excel, PowerPoint).
- Must be a proactive, flexible team player able to handle multiple tasks.
- Demonstrated ability to exercise initiative, self-motivation and follow-up.
- Must have superior communication, interpersonal and customer service skills.
Desired Qualifications:
- Experience developing software systems and programs using Power BI and web/scripting technologies (e.g., Python, PHP, ASP.NET, ColdFusion) for software enhancements, new systems, and products.
- Experience using HTML, JavaScript, jQuery, CSS, and Oracle, PostgreSQL, or MySQL databases for system development.
- Knowledge of Generative AI (GenAI) and Large Language Models (LLMs) to drive a new era of productivity and connectivity within the Federal Government. Experience with teams working to discover techniques for deploying Machine Learning (ML) models with the AI data cloud.
- FAA Facility, Service Area, or Headquarters Managerial or Support Experience.
- Demonstrated ability to work independently, requiring minimal guidance and support.
- Ability to juggle overlapping priorities and meet tight deadlines.
- Strong analytical skills to support variance reporting and quick report turnaround on an ad hoc / on-demand basis, as required by the Program Managers and leadership.
- Well-organized, schedule and quality driven, and a proactive multi-tasker.
Automation Skills:
- Advanced working knowledge of the Microsoft Office Suite (with a focus on Power BI and Excel) and other general-purpose business management software.
- High level of proficiency with editing, graphic design, visual presentation, multimedia, and web support using state-of-the-art software and applications.
Administrative Skills:
- Performs administrative and management support services for a wide range of inter and intra-office coordination and interaction. Provides administrative support in the management of financial data collection, documentation, operations and technical programs. Position requires substantial coordination and interfacing with client program offices.
- Support Functions: Supports the FAA Customer in the following areas:
- Maintaining record, data and information on all current Task Orders/Contracts including but not limited to contract number, assigned Program Managers, COR/TOR/ETO/CO, Task Order funds status / dollar amounts, periods of performance, run-out dates.
- Develop and produce financial reports consisting of detailed contracting information that can be updated expeditiously and concurrently to satisfy requests from senior management. Keep all contractual data current, track all task orders/contracts for renewals, expiries, and re-competes, and notify the ETO/TOR in a timely manner to facilitate prompt action.
- Keeps data current and produces accurate documentation to communicate status and changes to budget, schedule, and financials, ensuring that all changes and outcomes are clearly presented. Performs data validation and implements quality checks to avoid errors.
- Monitors budgeted-vs-actual expenditure of funds through continuous review and examination of accounting records, budgetary generated purchase requests, and knowledge of budgetary functions and coordination with the Technical Operations acquisition staff. Maintain electronic files for limited acquisition documents i.e., purchase requests, telework logs, travel log etc. Active collaboration with the ETO/TOR and AJW-2410 Procurement/Acquisition staff.
- Create, maintain and report/update Task Order Dashboard summarizing current procurement, budget allocations, commitments, obligations, and current status of contract actions underway with the Procurement Office. Creates and reports other financial metrics as requested with the intended audience being senior management. Prepares presentations, proposals, and a wide variety of other supporting project documentation.
- Works with contractors and government counterpart in tracking SOW/PWS/SOO contractor labor hours/tasks to input and continuously update the dashboard. This includes the coordination of monthly contract status reviews (strategy, budget, performance, etc.) with ETO/TOR, including regular interaction with contractors and government staff to update the system. Attends and/or coordinates meetings to capture baselines and track changes.
- Monitor day-to-day activities and functional changes that may impact requirements and budgets; analyze program resource requirements and budget data, information, and documentation. Report findings, recommend any needed corrective actions or proposed solutions, and assist in the execution of accepted recommendations or solutions.
Senior Data Systems Developer
Posted today
Job Description
A Data Engineering position is available for an ambitious professional seeking to develop scalable ETL pipelines and data architectures.
Our ideal candidate will design, implement, and maintain large-scale data systems to support high-performance data storage and retrieval. This involves integrating data from various sources into centralized data warehouses or data lakes using cloud-based services like AWS, Azure, or GCP.
As a collaborative team member, you will work closely with cross-functional teams to deliver data solutions that adhere to industry best practices for data governance, quality, and security.
Responsibilities include developing and maintaining data models, schemas, and metadata documentation to facilitate efficient data management.
Key Requirements:
- Bachelor's or Master's Degree in Computer Science, Information Technology, or related field.
- 5–7 years of hands-on experience in Data Engineering or a related role.
- Strong proficiency in SQL and experience with relational and non-relational databases.
- Expertise in data pipeline tools such as Apache Airflow, Kafka, Spark, NiFi, or Talend.
- Strong programming skills in Python, Java, or Scala for data manipulation and automation.
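The pipeline-tooling requirement above (Airflow, Kafka, Spark, NiFi) reduces, at its core, to executing tasks in dependency order. A toy topological executor in pure Python illustrates the idea; the task names are hypothetical, and a real orchestrator like Apache Airflow adds scheduling, retries, and monitoring on top:

```python
def run_pipeline(tasks, deps):
    """Run tasks in an order that respects deps (task -> set of prerequisites).

    A miniature of what DAG orchestrators do when they schedule a run.
    """
    done, order = set(), []
    while len(done) < len(tasks):
        # a task is ready once all of its prerequisites have completed
        ready = [t for t in tasks if t not in done and deps.get(t, set()) <= done]
        if not ready:
            raise ValueError("cycle or missing dependency in DAG")
        for t in sorted(ready):   # deterministic order among ready tasks
            tasks[t]()
            done.add(t)
            order.append(t)
    return order

log = []
tasks = {"extract": lambda: log.append("E"),
         "transform": lambda: log.append("T"),
         "load": lambda: log.append("L")}
order = run_pipeline(tasks, {"transform": {"extract"}, "load": {"transform"}})
# extract runs first, then transform, then load
```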
Senior Data Platform Developer
Posted today
Job Description
Data Engineer Job Opening
We are seeking a skilled Data Engineer to design and implement data architecture, pipelines and ETL processes.
Key Responsibilities:
- Design and develop efficient data architecture, pipelines and ETL processes.
- Develop and maintain robust data platforms and dashboards to support analytics and reporting.
- Evaluate and ensure high-quality, reliable, and available data across all data platforms.
- Develop and implement Infrastructure as Code (IaC) and Continuous Integration/Continuous Deployment (CI/CD) to streamline and automate the deployment process.
- Collaborate with cross-functional teams, data scientists, analysts, and stakeholders to understand data needs and deliver solutions.
- Optimize data storage and retrieval processes for efficiency and performance.
- Provide Level 3 support for in-house development work and data platform/warehouse.
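The "ensure high-quality, reliable, and available data" responsibility often takes the form of automated validation rules. A minimal sketch of one such rule (the column name and threshold are hypothetical):

```python
def check_completeness(rows, column, threshold=0.95):
    """Flag a dataset when the non-null ratio of a column falls below threshold."""
    non_null = sum(1 for r in rows if r.get(column) is not None)
    ratio = non_null / len(rows) if rows else 0.0
    return {"column": column, "ratio": ratio, "passed": ratio >= threshold}

rows = [{"id": 1, "email": "a@x.co"}, {"id": 2, "email": None},
        {"id": 3, "email": "c@x.co"}, {"id": 4, "email": "d@x.co"}]
report = check_completeness(rows, "email")
# 3 of 4 rows populated -> ratio 0.75, below the 0.95 threshold, so the check fails
```

Platforms typically run batteries of such checks (completeness, uniqueness, freshness) after each pipeline stage and alert on failures.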
Requirements:
- Bachelor's degree in Computer Science, Information Systems, or related field.
- At least 2-5 years of experience in data engineering or related roles.
- Proven expertise in building data platforms in the cloud.
- Strong proficiency in SQL, Python, and other relevant programming languages.
- Familiarity with data modeling, ETL processes, and data warehousing concepts is required.
- Excellent problem-solving skills and attention to detail.
- Effective communication skills and ability to work collaboratively in a team environment.
Temenos Data Source Developer
Posted today
Job Description
Responsibilities
- Develop and parameterize our application FI-Master, built on Temenos Data Source (TDS), according to business requirements and operational needs.
- Set up new interfaces, data providers, imports, and exports, including the necessary data mapping and rule setup in TDS.
- Define and execute data migrations.
- Execute complex data queries for analysis and data corrections.
- Develop and support Bash and Python scripts.
- Support business requests in daily work with analysis, solutions, advice, and corrections where necessary.
- Monitor daily processes and support incident resolution as L3 support.
- Set up and execute release deployments.
REQUIREMENTS
- 4-5 years' experience in a similar role within the financial industry
- Very good understanding of software design, development, and customization processes
- Capability to analyze complex and poorly defined problems related to the application and to provide solutions independently
- Excellent analytical skills
- Basic technical knowledge of financial instruments
- Sound experience in parameterization in Temenos Data Source (TDS), including setup of new providers, imports, exports, and complex rules using plugins
- Sound knowledge of the TDS data model and how to set up decoupling of a provider
- Experience with different external data providers, mainly SIX and Bloomberg
- Familiar with Oracle DB, SQL, and PL/SQL
- Familiar with Python and pandas
- Experience in Unix system management and JBoss
- Experience in configuration management and deployment automation, especially with Bitbucket, Nexus, Jenkins, and Octopus
- Familiar with working in an agile environment with Jira
- Experience in incident management and 2nd/3rd-level support
- Experience with requirements engineering and translating business requirements into IT solution design
Senior Data Pipeline Developer
Posted today
Job Description
Job Description:
As a key member of our infrastructure team, you will lead the development of critical pipelines and workflows that enable rapid model training, validation, and deployment. Your mission is to own and operate data pipelines, design and implement MLOps workflows, and partner with modeling engineers to accelerate their research.
- Develop scalable data ingestion and processing solutions for robotics datasets.
- Implement CI/CD pipelines for model training, data versioning, and large-scale validation.
- Create tools and services that accelerate model research and development.
- Maintain high development standards through code and design reviews.
Required Skills and Qualifications:
At least 5 years of experience in Robotics, Autonomous systems, Machine Learning, Statistics, Applied Mathematics, or a related field is required.
A deep understanding of machine learning, deep learning, data mining, and the algorithmic foundations of optimization, along with knowledge of model compression, quantization, and inference techniques, is a must-have.
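As a concrete picture of the quantization knowledge mentioned above, here is a minimal symmetric int8 quantization of a weight vector in pure Python. This is a simplified sketch of the schemes inference runtimes use, not any particular framework's implementation; the example weights are arbitrary:

```python
def quantize_int8(weights):
    """Symmetric int8 quantization: map floats into [-127, 127] by one scale."""
    scale = max(abs(w) for w in weights) / 127.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from their int8 codes."""
    return [v * scale for v in q]

w = [0.6, -1.0, 0.25]
q, scale = quantize_int8(w)       # largest magnitude maps to +/-127
restored = dequantize(q, scale)
err = max(abs(a - b) for a, b in zip(w, restored))
# the reconstruction error is bounded by about half the scale
```

Production schemes add per-channel scales, zero-points for asymmetric ranges, and calibration over activation statistics, but the round-trip above is the core idea.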
Benefits:
Work alongside experienced professionals who share your passion for innovation.
Enjoy opportunities for professional growth and development.
Stay up-to-date with the latest advancements in the field.
Others:
This role requires strong collaboration and communication skills.
Senior Python Developer for Data Engineering
Posted today
Job Description
We're seeking an experienced Python Backend Engineer to join our high-performing team. As a key contributor, you'll play a pivotal role in developing robust systems that process and transform market data at scale.
This is an exceptional opportunity for individuals with a strong background in backend development using Python. Your expertise will help shape the future of our data engineering environment.
Key Responsibilities:
- Design and implement high-quality Python services for data processing and transformation
- Collaborate with tech leads and cross-functional teams to drive business priorities forward
- Participate in code reviews and contribute to technical decision-making
- Evaluate and ensure performance, reliability, and scalability of backend systems
- Apply modern Python tooling and best practices (TDD encouraged)
- Maintain and improve existing codebases
Skill Requirements:
- 7+ years of experience in backend development with a focus on Python
- Excellent knowledge of testing methodologies (TDD, unit/integration testing)
- Experience with Python packaging tools like Poetry or uv
- Familiarity with typed Python, type hints, and code readability standards
- Ability to write clear, maintainable, and well-documented code
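The typed-Python and TDD expectations above can be illustrated with a small example of the style implied: a pure, fully type-hinted function with its test written alongside it (the domain types and names are hypothetical):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Tick:
    symbol: str
    price: float

def mid_prices(ticks: list[Tick]) -> dict[str, float]:
    """Average price per symbol -- typed, pure, and trivially unit-testable."""
    grouped: dict[str, list[float]] = {}
    for t in ticks:
        grouped.setdefault(t.symbol, []).append(t.price)
    return {sym: sum(ps) / len(ps) for sym, ps in grouped.items()}

# TDD-style check kept next to the implementation
ticks = [Tick("BTCUSD", 0.5), Tick("BTCUSD", 1.5), Tick("ETHUSD", 2.0)]
assert mid_prices(ticks) == {"BTCUSD": 1.0, "ETHUSD": 2.0}
```

Because the function takes plain data and returns plain data, it slots directly into pytest-style unit tests, and the type hints make tools like mypy useful on the codebase.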
Nice-to-Have Skills:
- Experience with Kubernetes or CI/CD workflows
- Exposure to observability tools like Datadog
- Active contributor to open-source projects (e.g., GitHub profile)
- Knowledge of Java or Go (a bonus if paired with strong Python fundamentals)