144 Big Data Technologies jobs in Singapore

AI/Data Science/Data Engineering Specialist

Arch Systems, LLC

Posted today

Job Description

Job Title: Senior AI/Data Science/Data Engineering Specialist (Consulting to Full-Time)
Location: Remote (work from home; US business hours, 6:30 PM IST to 3:30 AM IST)
Job Type: Full-time, Part-time, Contractual / Temporary
Opportunity: Transition to full-time based on performance
Start Date: Immediate joiners preferred

About the Role:
We are seeking a Senior AI/Data Science/Data Engineering Specialist for a high-impact role supporting advanced data initiatives. This role requires working during US business hours (India time: 6:30 PM IST to 3:30 AM IST).

Key Responsibilities:
  • Design, develop, and deploy AI/ML models and data pipelines in production-grade environments.
  • Perform advanced data analysis, feature engineering, and exploratory data modeling.
  • Build, optimize, and maintain ETL/ELT processes using large-scale datasets.
  • Collaborate with cross-functional teams to identify data-driven solutions to business problems.
  • Implement and maintain scalable machine learning pipelines.
  • Ensure high code quality, maintainability, and documentation.

Must-Have Skills:
  • Strong expertise in Python for data science, scripting, and automation
  • 5+ years of hands-on experience in AI/ML model development, data engineering, or data science
  • Experience with Pandas, NumPy, Scikit-learn, TensorFlow, PyTorch, or similar libraries
  • Proficient in SQL, data modeling, and cloud data tools (AWS, GCP, or Azure)
  • Experience with Apache Airflow, Spark, or other big data tools is a plus
  • Strong understanding of data warehousing, pipelines, and MLOps concepts

Preferred Qualifications:
  • Prior experience in consulting engagements with US clients
  • Familiarity with CI/CD pipelines and version control (Git)
  • Excellent communication and problem-solving skills
  • Ability to work independently in a remote, cross-time-zone environment

Schedule: Night shift (US shift)
Experience: Data science: 5 years (required); AI: 1 year (required)
Shift availability: Night shift (required)
Work Location: Remote


Specialist - Data Engineering

Singapore, Singapore Nasdaq

Posted today

Job Description

As a member of the Internal Audit Analytics and Automation team, you will be responsible for developing, planning, and executing analytics and automation tools to evaluate internal controls across the organization, ensuring they are effective and aligned with industry best practices and regulatory requirements.

If you enjoy working in results-driven and high-performing international cultures and thrive in a creative and dynamic environment, this role is for you.

Responsibilities include:

  1. Leading the design and execution of advanced data analytics and automation initiatives to enhance audit effectiveness and efficiency.
  2. Collaborating with Internal Audit team members to collect, process, and analyze data to identify trends and insights.
  3. Partnering with cross-functional stakeholders to embed analytics into the audit lifecycle and risk assessment processes.
  4. Developing and maintaining reusable analytics assets such as scripts, dashboards, and automation templates.
  5. Evaluating emerging technologies (e.g., AI/ML, RPA) for integration into audit methodologies.
  6. Driving continuous improvement by identifying gaps and proposing data-driven enhancements.
  7. Presenting analytical findings and automation outcomes to senior leadership.
  8. Contributing to audit strategy development using data insights.

Qualifications:

  • Bachelor's degree in Analytics, Information Systems, Computer Science, Accounting, Finance, or related fields.
  • Up to 6 years of experience in data analysis, audit, or related areas.
  • Strong analytical and problem-solving skills.
  • Familiarity with programming and data analytics tools (Power BI, Alteryx, Power Automate, etc.).
  • Basic understanding of audit principles, internal controls, and risk management.
  • Experience with API development.
  • Excellent communication and partnership skills.
  • Data Analytics and Automation Certification preferred.

We are committed to inclusivity and provide reasonable accommodations for individuals with disabilities. Please contact us to request accommodations.


Data Engineering Manager

Singapore, Singapore SP Group

Posted today

Job Description

Join to apply for the Data Engineering Manager role at SP Group

Make the most of your talents and develop products that can create impact on a national scale. We are an in-house software team, assembled to move with speed and deliver with quality.

We Build Reliable Solutions. For Customers, Company and Country.

You will be part of the Digital Team and together, you will innovate, create, and deploy digital products that will empower more than 3,800 employees within SP Group and improve the quality of life for more than 1.7 million commercial, industrial and residential customers that SP Group serves. We build solutions that enable sustainable high-quality lifestyles and help consumers save energy and cost, as well as supporting national goals for a sustainable liveable city.

Now, imagine the impact you can create.

Job Summary

We are seeking a highly skilled and experienced Data Engineering Manager to join our dynamic data team. The ideal candidate will translate our strategic data vision into actionable solutions, with a balanced focus on hands-on technical delivery, end-to-end project leadership, and team enablement.

What You'll Do:

  • Lead discussions with business users and stakeholders to solicit and document requirements for data projects, including data usage, master data management, and AI initiatives.
  • Translate business needs into clear, actionable technical specifications and work breakdowns.
  • Lead sprint ceremonies and manage project roadmaps to drive on-time, on-budget delivery.
  • Ensure clear visibility into project milestones and resource utilization.
  • Proactively identify and remove obstacles by coordinating with stakeholders and ensuring the team has clear direction, resources, and support to maintain delivery momentum.
  • Lead the design, development, and optimization of scalable data pipelines, Master Data Management processes, and data services.
  • Define robust data models and implement data quality checks along with governance frameworks.
  • Encourage continuous learning by assigning real-world projects and case studies, then guiding reflective debriefs to reinforce lessons and foster independent problem-solving.
  • Facilitate hands-on workshops and knowledge-sharing sessions to strengthen technical competencies and promote best practices across data operations.
  • Establish and document best-practice run-books, decision logs, and process playbooks.
  • Identify and implement process enhancements, enforce CI/CD standards for data workflows, and maintain compliance with data-security and regulatory requirements.

What You’ll Need:

  • Experience in Data Engineering, Data Operations or closely related roles
  • Experience in leading requirements gathering sessions and translating business needs into technical specifications.
  • Strong proficiency in data management practices, including creating data dictionaries and designing data models.
  • Experience building and operating large-scale data lakes and data warehouses.
  • Experience with Hadoop ecosystem and big data tools, including Spark and Kafka
  • Experience with stream-processing systems including Spark-Streaming
  • Hands-on experience with UML and data specification development.
  • Experience with Agile & DevOps Practices would be advantageous
  • Knowledge of utilities data standards such as the IEC Common Information Model would be advantageous
  • Experience in AI and machine learning projects would be advantageous.
  • Domain knowledge in Utilities sector would be advantageous
  • Experience in Data Fabric and/or Data Mesh Architecture would be advantageous
  • Experience in Open Data-Lakehouse would be advantageous
  • Excellent communication and interpersonal skills, with the ability to translate complex technical plans into actionable updates.
  • Strong analytical and problem-solving abilities, with keen attention to detail.
  • Skilled in project-management tools and methodologies; uses data-driven insights to track progress, forecast resource needs, and inform decision-making

What We’ll Provide:

  • Opportunity to work on the cutting edge of digital engineering practices
  • Collaborative and fast-paced work environment
  • Be at the forefront of shaping our company's digital future
Seniority level
  • Not Applicable
Employment type
  • Full-time
Job function
  • Information Technology
Industries
  • Utilities, Energy Technology, and IT System Data Services


Intern, Data Engineering

189773 $10 Monthly SUNTORY BEVERAGE & FOOD ASIA PTE. LTD.

Posted 3 days ago

Job Description

Position overview:
You will be responsible for assisting with data collection, analysis, and reporting.


Key responsibilities:

  • Work closely with procurement datasets to build data visualisations and dashboards that inform strategic planning and sound business decisions
  • Understand functional and technical requirements from the Procurement team to build reports that deliver actionable insights to key stakeholders
  • Assist the Procurement team to build and maintain data infrastructure (data pipelines, aggregated data sets, reports, dashboards) to facilitate development of key metrics to measure efficiency/impact/outcomes of day-to-day operations
  • Automate Procurement data collection and streamline existing Procurement manual processes to enhance operational efficiency.

Job Specifications (Criteria of Eligibility):

  • Currently pursuing a Bachelor's degree or higher, preferably major in Analytics, Mathematics, Statistics, Engineering, Computer Science or related field
  • Familiar with at least one programming/scripting language, preferably Python, SQL or Spark and familiar with Excel or Power BI
  • Experience with data modeling concepts, star schema and data vault
  • Experience in data analysis and data collection required
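The star schema and data vault mentioned in the criteria are warehouse modeling patterns. As a minimal, illustrative sketch (all table and column names here are hypothetical), a star schema keeps measures in a central fact table and descriptive attributes in dimension tables:

```python
import sqlite3

# Hypothetical star schema: one fact table joined to two dimension tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_supplier (supplier_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_date (date_id INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
CREATE TABLE fact_purchase (
    purchase_id INTEGER PRIMARY KEY,
    supplier_id INTEGER REFERENCES dim_supplier(supplier_id),
    date_id INTEGER REFERENCES dim_date(date_id),
    amount REAL
);
INSERT INTO dim_supplier VALUES (1, 'Acme'), (2, 'Globex');
INSERT INTO dim_date VALUES (10, 2024, 1), (11, 2024, 2);
INSERT INTO fact_purchase VALUES
    (100, 1, 10, 250.0), (101, 2, 10, 100.0), (102, 1, 11, 50.0);
""")

# A typical analytical query: aggregate facts, slice by dimension attributes.
rows = conn.execute("""
    SELECT s.name, d.month, SUM(f.amount)
    FROM fact_purchase f
    JOIN dim_supplier s ON f.supplier_id = s.supplier_id
    JOIN dim_date d ON f.date_id = d.date_id
    GROUP BY s.name, d.month
    ORDER BY s.name, d.month
""").fetchall()
print(rows)  # [('Acme', 1, 250.0), ('Acme', 2, 50.0), ('Globex', 1, 100.0)]
```

This shape is what makes dashboard queries cheap: every report is a join from the fact table out to the dimensions it slices by.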

Data Engineering (ETL & Modeling)

Singapore, Singapore Airwallex

Posted today

Job Description

About Airwallex

Airwallex is the only unified payments and financial platform for global businesses. Powered by our unique combination of proprietary infrastructure and software, we empower over 150,000 businesses worldwide – including Brex, Rippling, Navan, Qantas, SHEIN and many more – with fully integrated solutions to manage everything from business accounts, payments, spend management and treasury, to embedded finance at a global scale.

Proudly founded in Melbourne, we have a team of over 1,700 of the brightest and most innovative people in tech across more than 37 locations around the globe. Valued at US$5.6 billion and backed by world-leading investors including Sequoia, Lone Pine, Greenoaks, DST Global, Salesforce Ventures and Mastercard, Airwallex is leading the charge in building the global payments and financial platform of the future. If you’re ready to do the most ambitious work of your career, join us.

Hiring Location: Singapore and Shanghai
Level: Mid-level, Senior, and Staff level

About the Team:

As a global team, we span across Australia, China, USA, and Singapore, revolutionizing applied data science, data engineering and platform solutions to support Airwallex's rapid growth. You will collaborate with a diverse range of cross-functional partners, including Product, Engineering, Marketing, Sales, Finance, and more, to tackle complex data problems and shape the future of fintech.

What you’ll do

Part 1. Data Modeling

  • Design and implement robust and scalable data models that support business intelligence, machine learning, and operational needs.

  • Possess a deep understanding of data schemas and be able to select appropriate schema designs (e.g., star schema, snowflake, normalized vs denormalized) based on use cases.

  • Collaborate closely with business teams to translate their data needs into clean, structured, and well-documented models.

  • Understand and promote the concept of SSOT (Single Source of Truth) throughout the data layers and pipelines.

  • Maintain data consistency, traceability, and quality across multiple data sources and domains.

Part 2. ETL and Data Pipeline Management

  • Experience building and maintaining both batch and streaming ETL pipelines, with a strong understanding of end-to-end data workflow — from data ingestion to transformation and delivery.

  • Able to work closely with Data Platform Engineers (DPEs) and Product Managers (PMs) to quickly identify root causes of data issues and provide efficient, scalable solutions.

  • Bonus if you’ve worked with data across distributed or multi-datacenter systems, including solving challenges related to data migration, duplication, and consistency.
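As a rough, illustrative sketch of the extract-transform-deliver flow described above (the field names and the in-memory "delivery" step are hypothetical stand-ins for real sources and warehouses):

```python
import csv
import io

# Hypothetical raw extract: one payment record per line.
raw = """txn_id,currency,amount_cents
t1,USD,1250
t2,SGD,900
t3,USD,50
"""

def extract(source):
    """Ingest: parse raw CSV rows into dicts."""
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows):
    """Transform: convert cents to a decimal amount, drop malformed rows."""
    out = []
    for r in rows:
        try:
            out.append({"txn_id": r["txn_id"],
                        "currency": r["currency"],
                        "amount": int(r["amount_cents"]) / 100})
        except (KeyError, ValueError):
            continue  # in a real pipeline, route to a dead-letter store
    return out

def load(rows):
    """Deliver: here we just aggregate per currency instead of writing
    to a warehouse table."""
    totals = {}
    for r in rows:
        totals[r["currency"]] = totals.get(r["currency"], 0) + r["amount"]
    return totals

totals = load(transform(extract(raw)))
print(totals)  # {'USD': 13.0, 'SGD': 9.0}
```

A streaming variant runs the same transform per event as it arrives rather than per batch, which is where root-causing data issues with platform engineers becomes important.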

Part 3. Data Governance

  • Participate in and contribute to data governance strategies, policies, and standards.

  • Be familiar with any of the six key pillars of traditional data governance (e.g., data quality, data stewardship, metadata management, master data management, data privacy/security, data lifecycle).

Part 4. Data + AI

  • We hope you have a basic understanding of AI and enjoy thinking about how data engineering and AI can work together in practical and creative ways.

These four areas are the main focuses of this role. Ideally, you should be strong in at least one of them — especially data modeling or data ETL. If you also have experience or skills in the other areas, that would be a big plus.

Who you are

We're looking for people who meet the minimum qualifications for this role. The preferred qualifications are great to have, but are not mandatory.

Minimum qualifications:
  • Bachelor’s degree in Computer Science, Information Systems, Finance, Maths or a related field. A master's degree is a plus.

  • Proven experience (5+ years) in designing and implementing ETL processes using tools such as Informatica, Talend, Apache NiFi, or similar. If you have less than five years of work experience but have done very well in similar roles, we’re happy to consider you for a Mid-Level position (Data Engineer II). Just keep in mind that this role requires at least two years of work experience.

  • Strong expertise in data modeling.

  • Proficiency in SQL, database management systems (e.g., MySQL, PostgreSQL, Oracle), and data warehousing solutions.

  • Familiarity with Google Cloud Platform (GCP), specifically BigQuery and Airflow.

  • Excellent problem-solving skills, with a keen attention to detail and a commitment to producing high-quality work.

  • Strong communication and collaboration skills, with the ability to work effectively in a fast-paced, team-oriented environment.

  • Outstanding verbal communication skills for effective interaction with overseas teams.
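Airflow, listed among the qualifications, models a pipeline as a directed acyclic graph (DAG) of tasks run in dependency order. The core idea can be sketched in plain Python with the standard library's graphlib, without any Airflow dependency (the task names are hypothetical):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: extract -> transform -> {load_warehouse, data_quality_check}
# Each key maps a task to the set of upstream tasks it depends on.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load_warehouse": {"transform"},
    "data_quality_check": {"transform"},
}

def run(dag, tasks):
    """Execute every task once, in an order that respects dependencies."""
    order = list(TopologicalSorter(dag).static_order())
    return [tasks[name]() for name in order], order

# Stand-in task bodies; in Airflow these would be operators.
tasks = {name: (lambda n=name: f"ran {n}") for name in dag}
results, order = run(dag, tasks)
print(order)
```

Airflow adds scheduling, retries, and backfills on top of this ordering, but the dependency graph is the part that shapes how pipelines are designed.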

Preferred qualifications:
  • Experience with financial industries (or financial applications, with a focus on Treasury and Ledger systems), payment systems, or fintech platforms.

  • Knowledge of data governance practices and regulatory requirements in the financial industry.

  • Experience with scripting languages (e.g., Python, R) for data analysis and automation.

  • Certification in data management or related technologies is a plus.

At Airwallex, you can make an impact in a rapidly growing, global fintech. We want you to share in our success, which is why you’ll be offered a competitive salary plus valuable equity within Airwallex. We also like to ensure we create the best environment for our people by providing collaborative open office space with a fully stocked kitchen. We organise regular team-building events and we give our people the freedom to be creative.

Equal opportunity

Airwallex is proud to be an equal opportunity employer. We value diversity and anyone seeking employment at Airwallex is considered based on merit, qualifications, competence and talent. We don’t regard color, religion, race, national origin, sexual orientation, ancestry, citizenship, sex, marital or family status, disability, gender, or any other legally protected status when making our hiring decisions. If you have a disability or special need that requires accommodation, please let us know.

Airwallex does not accept unsolicited resumes from search firms/recruiters. Airwallex will not pay any fees to search firms/recruiters if a candidate is submitted by a search firm/recruiter unless an agreement has been entered into with respect to specific open position(s). Search firms/recruiters submitting resumes to Airwallex on an unsolicited basis shall be deemed to accept this condition, regardless of any other provision to the contrary.


Technical Lead (Data Engineering)

Singapore, Singapore Infosys Limited

Posted today

Job Description

Technical Lead (Data Engineering), iCZ - Senior Consultant

Detailed Job Description:

Qualifications:
  • Advanced degree in Computer Science, Data Engineering, or a related field.
  • Certifications in Snowflake or data engineering tools are a plus.
Required Skills & Experience:
  • 10+ years of experience in data engineering with a proven track record of delivering large-scale data solutions.
  • Strong expertise in Snowflake, Advanced SQL, GBI ETL framework, and Python.
  • Hands-on experience in designing and implementing data pipelines, data integration, and data warehousing solutions.
  • Fluency in English and Mandarin (both spoken and written), enabling effective communication with global stakeholders.
  • Prior experience in stakeholder management, including requirements gathering, design playback, and presentations.
  • Familiarity with cloud platforms such as AWS, Azure, or Google Cloud.
  • Strong problem-solving skills and the ability to think critically about data architecture and process optimization.

Head of Data Engineering

Singapore, Singapore BTSE Holdings Limited

Posted today

Job Description

Join to apply for the Head of Data Engineering role at BTSE

About BTSE

Nogle SG is a specialized service provider dedicated to delivering a full spectrum of front-office and back-office support solutions, each of which is tailored to the unique needs of global financial technology firms. Nogle SG is engaged by BTSE Group to offer several key positions, enabling the delivery of cutting-edge technology and tailored solutions that meet the evolving demands of the fintech industry in a competitive global market.

BTSE Group is a leading global fintech and blockchain company that is committed to building innovative technology and infrastructure. BTSE empowers businesses and corporate clients with the advanced tools they need to excel in a rapidly evolving and competitive market. BTSE has pioneered numerous trading technologies that have been widely adopted across the industry, setting new benchmarks for innovation, performance, and security in fintech. BTSE’s diverse business lines serve both retail (B2C) customers and institutional (B2B) clients, enabling them to launch, operate, and scale fintech businesses. BTSE is seeking ambitious, motivated professionals to join our B2C and B2B teams.

About The Opportunity

We are looking for a Head of Data Engineering to lead the design and implementation of our data infrastructure pipeline, supporting our rapidly expanding crypto exchange operations. This role will focus on managing and architecting systems capable of handling massive volumes of real-time transaction data, user behavior data, market feeds, and other critical information for various business functions. You will work closely with cross-functional teams to ensure the availability, scalability, and cost-efficiency of the data systems.

The ideal candidate is a hands-on leader who can build and scale complex data infrastructures, optimize data storage and processing, and ensure that our systems are fast, secure, and compliant.

Responsibilities

  • Data Architecture and Infrastructure Design: Lead the design and architecture of scalable, high-performance data systems capable of storing and processing vast amounts of transactional data, user behavior logs, market data, and trading activity. Ensure systems are highly available and optimized for low-latency access.
  • Real-Time Data Processing: Architect and implement systems that handle high-throughput, real-time data streams from cryptocurrency transactions, market feeds, and user activities on the platform. Ensure data is ingested and processed with minimal delay to allow for timely decision-making and trading execution.
  • Data Integration and Management: Design and implement robust systems for ingesting, storing, and integrating data from various sources, including transaction data, user logs, trading activity, and external data feeds such as market data or news sources. Normalize and organize this data to support analytics and business intelligence needs.
  • User and Transaction Data Storage: Develop strategies for storing large volumes of user behavior data (e.g., transaction history, trading patterns, login activity) and transaction data (e.g., order books, trades, withdrawals). Ensure that data is accessible in a fast, secure, and cost-effective manner.
  • Data Governance, Quality, and Compliance: Establish strong data governance practices to ensure data integrity, quality, and compliance with regulations (e.g., GDPR, CCPA, AML/KYC requirements). Implement data validation, cleansing, and auditing processes to maintain high-quality and reliable datasets.
  • Cost Optimization: Design systems that balance performance with cost-efficiency, using techniques such as sharding, data partitioning, and compression. Ensure that the data infrastructure scales without unnecessary expense as the business grows.
  • Leadership and Team Management: Lead and mentor a team of data engineers, driving best practices, performance standards, and fostering a collaborative culture. Provide guidance and career development to help team members grow professionally.
  • Cross-Department Collaboration: Work closely with product, marketing, research, security, and compliance teams to ensure that data infrastructure meets the diverse needs of the business. Facilitate the development of analytical frameworks that support product innovation, user behavior analysis, and regulatory compliance.
  • Technology Evaluation and Selection: Continuously evaluate emerging data technologies, ensuring that the organization uses the most effective tools for managing transaction and user data. Ensure that the data infrastructure is aligned with the company’s strategic goals and technical requirements.
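As a broker-free illustration of the incremental, low-latency processing these responsibilities call for (the event fields and the VWAP metric are hypothetical examples; a production system would sit on a streaming platform such as Kafka or Flink):

```python
from collections import deque

class SlidingWindowVWAP:
    """Maintain a volume-weighted average price over the last `window_seconds`
    of trade events, updated incrementally as each event arrives."""

    def __init__(self, window_seconds=60):
        self.window = window_seconds
        self.events = deque()   # (ts, price, qty), oldest first
        self.notional = 0.0
        self.volume = 0.0

    def on_trade(self, ts, price, qty):
        self.events.append((ts, price, qty))
        self.notional += price * qty
        self.volume += qty
        # Evict events that fell out of the window; O(1) amortized per event,
        # so latency stays flat as throughput grows.
        while self.events and self.events[0][0] <= ts - self.window:
            _, old_p, old_q = self.events.popleft()
            self.notional -= old_p * old_q
            self.volume -= old_q
        return self.notional / self.volume if self.volume else None

# Simulated feed: (timestamp_seconds, price, quantity)
feed = [(0, 100.0, 1), (30, 102.0, 1), (90, 110.0, 2)]
vwap = SlidingWindowVWAP(window_seconds=60)
for ts, p, q in feed:
    latest = vwap.on_trade(ts, p, q)
print(latest)  # at t=90 only the t=90 trade is inside the 60s window -> 110.0
```

The same incremental-update-plus-eviction pattern underlies windowed operators in Flink and Spark Structured Streaming.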
Requirements

  • Technical Leadership: Proven experience leading and managing data engineering teams, with a focus on building and scaling data infrastructures that support high-frequency, real-time transactions, user behavior analysis, and large-scale data storage.
  • Data Architecture Expertise: Strong background in designing systems that handle large-scale transactional data, real-time streaming data, and high-volume user behavior data. Expertise in data storage systems like distributed databases and data lakes is essential.
  • Real-Time Data Processing: Hands-on experience with real-time data processing frameworks (e.g., Apache Kafka, Apache Flink, Apache Spark Streaming) and the ability to design low-latency systems to process crypto transaction data in real-time.
  • Data Integration and Management: Experience integrating diverse data sources (e.g., transaction data, market data, user logs) into a unified platform while ensuring consistency, availability, and performance across systems.
  • Cost Optimization: Ability to balance performance with cost-efficiency when designing data storage and processing systems for high-volume data, including implementing data partitioning, sharding, and compression techniques.
  • Security and Compliance Expertise: Strong understanding of data security practices, including encryption, access control, and compliance with financial regulations (e.g., AML/KYC). Familiarity with cryptocurrency-specific regulations is a plus.
  • Data Governance & Quality Control: Proven ability to implement data governance processes that ensure the integrity and quality of large datasets. Experience with data lineage, validation, and auditing processes.
  • Data Storage & Management Techniques: Experience with high performance distributed database management and efficient storage strategies.
  • Programming Languages: Proficiency in languages like Python, Java, or Scala for data engineering tasks. Experience with SQL and NoSQL databases (e.g., PostgreSQL, Cassandra, MongoDB).
  • Distributed Systems & Performance Optimization: Knowledge of distributed computing systems for big data storage and processing (e.g., Apache Hadoop, Apache Cassandra, HBase, or custom sharded architectures). Experience in optimizing systems for performance under heavy load.
  • Scalability & High Availability: Experience designing and maintaining scalable, high-availability data infrastructure for mission-critical applications with a focus on minimizing downtime and latency.
  • Containerization and Orchestration: Familiarity with containerization (Docker) and orchestration platforms (Kubernetes) to manage microservices and large-scale data workflows.
  • Cross-Functional Collaboration & Stakeholder Management: Ability to collaborate across different teams (e.g., product, compliance, research) to ensure the data infrastructure aligns with the business’s strategic needs and regulatory requirements.
  • Data Modeling & Metadata Management: Expertise in designing data models optimized for querying and analytics. Experience managing metadata to ensure smooth access to structured data for internal teams.
Nice To Haves

  • A degree in Computer Science, Engineering, Mathematics, or a related field.
  • Experience working in a financial, trading, or cryptocurrency environment.
  • Familiarity with blockchain technologies and cryptocurrency trading platforms.
  • Strong understanding of data privacy and security standards relevant to financial services and crypto exchanges (e.g., PCI DSS, GDPR).
  • Experience with distributed ledgers or blockchain technologies.
Perks And Benefits

  • Competitive total compensation package
  • Various team building programs and company events
  • Comprehensive healthcare schemes for employees and dependants
  • And many more! Apply and let us tell you more!
Seniority level
  • Mid-Senior level
Employment type
  • Full-time
Job function
  • Engineering and Information Technology

Intern, AI & Data Engineering

Singapore, Singapore The Coca-Cola Company

Posted 14 days ago

Job Description

**Location:** Singapore
**Minimum Internship Commitment Duration:** 3-4 months
**Work Schedule:** Full-time, minimum of 5 days per week, 8AM - 5PM, with occasional evening meetings
**Internship Job Overview**
As a manufacturing company, The Coca-Cola Company is continuously looking for ways to improve forecast accuracy both upstream (Demand) and downstream (Go-To-Market outlet level). With this opportunity, we plan to develop new models that add to our current list of forecast models and elevate our predictability to the next level. We are looking for talent who are passionate about data modelling, with a strong analytical mind and great business sense.
**What You'll Do**
As an AI & Data Engineering Intern, you will:
* Assist in data acquisition and requirements gathering for new AI initiatives.
* Collaborate on enhancing existing workflows to improve accuracy and performance.
* Support the development of AI-powered data products using agentic frameworks and multi-agent systems.
* Work with LLMs and AI agents within frameworks like LangChain, LlamaIndex, and AutoGen.
* Contribute to the development of solutions leveraging Azure AI Foundry, MCP (Model Context Protocol), and A2A (Agent-to-Agent) frameworks.
**Who You Are**
We're looking for a curious, driven, and collaborative intern who is:
* Currently pursuing a Bachelor's or Master's degree in Data Science, Computer Science, Engineering, Statistics, or a related field.
* Experienced in Python through coursework, personal projects, or hackathons.
* Familiar with cloud infrastructure concepts, especially within the Azure ecosystem.
* Knowledgeable in data engineering, analytics, and ETL processes.
* Comfortable with SQL (basic to intermediate level).
* Aware of foundational AI concepts, including LLMs, AI agents, and multi-agent systems.
* A strong communicator with excellent organizational and problem-solving skills.
**Functional Skills**
We are keen to test new approaches and feature engineering that might improve the way we forecast, including geo-spatial analytics. Relevant skills include:
+ Azure cloud
+ Python
+ Machine Learning
+ Data Wrangling and ETL
+ Statistics
+ Geo-Spatial analytics skillset is a plus
+ Advanced level in SQL
+ Analytical mind and great business sense
+ Advanced analytics
+ Automation
+ Others
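As a small illustration of the geo-spatial skillset listed above, distance features between locations are a common starting point. A haversine great-circle distance in stdlib Python (the coordinates are approximate and purely illustrative):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two (lat, lon) points in degrees."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Illustrative: distance between two approximate points in Singapore,
# e.g. as a feature relating an outlet to a distribution centre.
d = haversine_km(1.3521, 103.8198, 1.2902, 103.8519)
print(round(d, 1))
```

Distances like this can then feed a forecast model as features (e.g. outlet-to-depot distance, density of nearby outlets).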
**Internship Details:**
+ The internship is scheduled for a duration of 6 months
+ The role requires a full-time commitment, minimum of 5 days per week. As we work with global teams, working hours may need to accommodate US and UK time zones for ad-hoc meetings.
**What We Offer:**
+ Mentorship from experienced professionals.
+ Opportunity to work on real projects and make meaningful contributions.
+ Networking opportunities within the industry.
+ Flexible working hours.
+ A collaborative and dynamic work environment.
**What can help you be successful in the role?**
**What We Can Do for You**
+ Our Purpose is 'TO REFRESH THE WORLD. MAKE A DIFFERENCE.' This role will allow you to have a direct impact on how we improve our revenue, share, and execution in the marketplace.
+ Working for a globally networked organization committed to scaling best practices and ideas, there is no limit to the impact you can have or the network of colleagues you can build.
+ As part of the Digital & Technology organization, you will empower the enterprise and the system by building connected digital and data platforms for the future.
**Growth Behavior:**
+ Growth Mindset: Demonstrates Curiosity. Welcomes failure as a learning opportunity.
+ Smart Risk: Makes bold decisions/recommendations.
+ Externally Focused: Understands the upstream and downstream implications of their work.
+ Performance Driven & Accountable: Has high performance standards. Outperforms their peers.
+ Fast/Agile: Removes barriers to move faster. Experiments and adapts. Thrives under pressure and fast pace.
+ Empowered: Brings solutions instead of problems. Challenges the status quo. Has the courage to take an unpopular stance.
We are an Equal Opportunity Employer and do not discriminate against any employee or applicant for employment because of race, color, sex, age, national origin, religion, sexual orientation, gender identity and/or expression, status as a veteran, disability, or any other federal, state, or local protected class.
This advertiser has chosen not to accept applicants from your region.

Lecturer (Data Analytics & Data Engineering)

828608 · $12,000 Monthly · DIGIPEN INSTITUTE OF TECHNOLOGY SINGAPORE PTE. LTD.

Posted 1 day ago


Job Description

Position

Full-time Lecturer (Data Analytics & Data Engineering) in the Department of Continuing Education, DigiPen Institute of Technology Singapore.

Description

We are seeking a seasoned professional to teach and develop a combined curriculum covering the end-to-end data lifecycle—from data collection and preprocessing through analytics, visualization, pipeline design, and big-data engineering. The successful candidate will design and deliver lectures, hands-on labs, and project-based assignments to equip students with practical skills and theoretical foundations.

Key Responsibilities

· Develop and deliver course materials covering core concepts and practical applications in data analytics, data engineering, machine learning, and related areas of data science.

· Design and grade practical lab exercises and real-world projects.

· Mentor and provide feedback to students on assignments and projects.

· Collaborate with industry partners to keep curriculum aligned with current best practices.

· Assess student performance and maintain accurate records.

· Participate in curriculum review and continuous improvement initiatives.

Required Qualifications

· Bachelor’s or Master’s degree in Computer Science, Data Science, Engineering, or a related field.

· A minimum of 3 years of industry experience in data analytics and data engineering roles for candidates with a Master’s degree, or 5 years of experience for those with a Bachelor’s degree.

· ACLP certification is preferred.

· Proficiency in Python, SQL/NoSQL databases, Apache Spark, and data pipeline tools (e.g., Kafka).

· Experience with data visualization tools (e.g., Tableau, Power BI) and cloud platforms (AWS, Azure).

· Strong communication and mentoring skills.

· Passion for teaching and ability to engage learners.


Application: Review of applications will begin immediately and continue until the position is filled. For a complete application, please email the following documents:

· A letter of interest,

· Curriculum vitae,

· Transcript of highest degree earned, and

· Portfolio showcase (if available)


Electronic submission is required with documents in Word or PDF format attached to an email with a subject line “Full-time Lecturer (Data Analytics & Data Engineering) Position in CE” sent to:


Ms. Priyanka Bhoyar

Program Manager for Continuing Education

Email:


You will be required to present your original transcripts and certificates if you are shortlisted for an interview, which is an integral part of the hiring process.


Data Engineering & Regulatory Technology Director

Boon Lay Way, Singapore 609966 · $16,000 Monthly · TEAMLEASE DIGITAL CONSULTING PTE. LTD.

Posted 1 day ago


Job Description

Role Overview:
We are seeking a highly experienced senior Data Engineering & Regulatory Technology specialist to lead the design, development, and implementation of big data platforms and regulatory reporting systems across the APAC region. The ideal candidate must bring a robust mix of technical leadership, architecture strategy, project management, and cross-border delivery expertise, particularly in banking or related industries.

Key Responsibilities:

  • Architect scalable data pipelines using Hadoop, Spark, Kafka, Flink, Hive, and real-time stream processing frameworks.
  • Oversee full-stack delivery of applications and services related to regulatory reporting, including integration with systems such as SBV Web Services.
  • Lead DevOps implementation (CI/CD, automated testing, containerized environments).
  • Translate APAC regulatory requirements into technical designs and delivery solutions.
  • Manage end-to-end reporting infrastructure, ensuring delivery of accurate, real-time compliance data to financial authorities (e.g., AML regulations).
  • Lead multi-disciplinary teams across geographies (devs, BAs, ops), managing performance, talent development, and resource planning.
  • Design and implement team growth strategies, skill development plans, and performance incentives.
  • Own full-lifecycle delivery: scope, schedule, risk, and resource management.
  • Align business and technical goals, coordinating with stakeholders in risk, compliance, and infrastructure teams.
  • Drive agile workflows and use project tools to ensure transparency and high-quality outcomes.

Required Qualifications:

  • 15+ years of experience in data engineering, big data systems, or enterprise software development, preferably within banking, FinTech, or related financial regulatory domains
  • Deep experience with Hadoop, Spark, Flink, Kafka, Hive, Scala, Tableau, SQL, and cloud environments (AWS/Aliyun)
  • Strong leadership experience managing cross-regional teams of 10–20+ people
  • Proven success in delivering regulatory tech, compliance platforms, or real-time data streaming systems

Preferred Qualifications:

  • Prior experience working with regulators in APAC
  • Certifications in Web, Database, or J2EE Engineering