318 Big Data Technologies jobs in Singapore
25886255 FRMT Application Support Intermediate Analyst (Unix, SQL, Big Data technologies)
Posted 3 days ago
Job Description
Whether you’re at the start of your career or looking to discover your next adventure, your story begins here. At Citi, you’ll have the opportunity to expand your skills and make a difference at one of the world’s most global banks. We’re fully committed to supporting your growth and development from the start with extensive on-the-job training and exposure to senior leaders, as well as more traditional learning. You’ll also have the chance to give back and make a positive impact where we live and work through volunteerism.
Shape your Career with Citi
Citibank serves as a trusted advisor to our retail, mortgage, small business and wealth management clients at every stage of their financial journey. Through Citi's Access Account, Basic Banking, Citi Priority, Citigold and Citigold Private Client, we offer an array of products, services and digital capabilities to clients across the full spectrum of consumer banking needs worldwide.
We’re currently looking for a high-caliber professional to join our team as FRMT Application Support Intermediate Analyst, based in Singapore. Being part of our team means that we’ll provide you with the resources to meet your unique needs, empower you to make healthy decisions, and manage your financial well-being to help plan for your future.
We are seeking an experienced candidate to join our Production/Application Support team. The ideal candidate will bring a blend of strong technical skills (Unix, SQL, Big Data technologies) and solid domain expertise in financial services (e.g. securities, secured financing, rates, liquidity reporting, derivatives, front-office/back-office systems, trade lifecycle).
Key Responsibilities:
- Provide L2 production support for mission critical liquidity reporting and financial applications, ensuring high availability and performance.
- Monitor and resolve incidents related to trade capture, batch failures, position keeping, market data, pricing, risk and liquidity reporting.
- Proactively manage alerts, logs, and jobs using Autosys, Unix tools, and monitoring platforms (ITRS/AWP).
- Execute advanced SQL queries and scripts for data analysis, validation, and issue resolution.
- Support multiple applications built on stored procedures, SSIS, SSRS, and Big Data ecosystems (Hive, Spark, Hadoop), and troubleshoot data pipeline issues.
- Maintain and improve knowledge bases, SOPs, and runbooks for production support.
- Participate in change management and release activities, including deployment validations.
- Lead root cause analysis (RCA), conduct post-incident reviews, and drive permanent resolutions.
- Collaborate with infrastructure teams on capacity, performance, and system resilience initiatives.
- Contribute to continuous service improvement, stability management, and automation initiatives.
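Much of the incident-triage work described above comes down to scanning application logs for failures and deciding which component to investigate first. A minimal sketch in Python (the log format and component names here are invented for illustration, not taken from any actual Citi system):

```python
from collections import Counter

# Hypothetical log lines in a common "timestamp level component message" layout.
LOG_LINES = [
    "2024-05-01T02:10:11 ERROR trade-capture Failed to parse inbound message",
    "2024-05-01T02:10:12 INFO pricing Curve bootstrap complete",
    "2024-05-01T02:11:03 ERROR batch EOD position job aborted",
    "2024-05-01T02:11:05 ERROR trade-capture Sequence gap detected",
]

def errors_by_component(lines):
    """Count ERROR entries per component to prioritise triage."""
    counts = Counter()
    for line in lines:
        parts = line.split(maxsplit=3)
        if len(parts) >= 3 and parts[1] == "ERROR":
            counts[parts[2]] += 1
    return counts

print(errors_by_component(LOG_LINES))
# trade-capture has the most errors, so it is investigated first
```

In practice the same counting logic would sit behind a scheduled job or a monitoring-platform rule rather than a standalone script.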
Required Skills & Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Information Technology, Engineering, or a related field
- 3+ years of experience in application or production support, with 2+ years at an advanced level
- Strong hands-on experience with:
- Unix/Linux (scripting, file manipulation, job control)
- SQL (MSSQL/Oracle or similar; stored procedures, SSIS, SSRS)
- Big Data technologies (Hadoop, Hive, Spark)
- Job Schedulers like Autosys
- Log analysis tools
- Solid understanding of financial instruments and the trade lifecycle (Equities, Fixed Income, Secured Financing, Derivatives, Liquidity Management)
- Knowledge of front office/back office and reporting workflows and operations
- Excellent analytical and problem-solving skills, with the ability to work in a time-sensitive environment
- Effective communication and stakeholder management skills across business and technical teams
- Experience with ITIL processes, including incident, problem and change management.
How You’ll Succeed
Be conscientious and consistent in identifying security vulnerabilities and working with the respective engineering teams and stakeholders to provide sound guidance and remediations. Be a team player, and a keen learner.
Working at Citi is far more than just a job. A career with us means joining a family of more than 230,000 dedicated people from around the globe. At Citi, you’ll have the opportunity to grow your career, give back to your community and make a real impact.
Take the next step in your career and apply for this role at Citi today.
AI/Data Science/Data Engineering Specialist
Posted today
Job Description
Job Title: Senior AI/Data Science/Data Engineering Specialist (Consulting to Full-Time)
Location: Remote (6:30 PM IST to 3:30 AM IST)
Job Type: Consulting, with the opportunity to transition to full-time based on performance
Start Date: Immediate joiners preferred
About the Role:
We are seeking a Senior AI/Data Science/Data Engineering Specialist for a high-impact role supporting advanced data initiatives. This role requires working during US business hours (India time: 6:30 PM IST to 3:30 AM IST).
Key Responsibilities:
- Design, develop, and deploy AI/ML models and data pipelines in production-grade environments.
- Perform advanced data analysis, feature engineering, and exploratory data modeling.
- Build, optimize, and maintain ETL/ELT processes using large-scale datasets.
- Collaborate with cross-functional teams to identify data-driven solutions to business problems.
- Implement and maintain scalable machine learning pipelines.
- Ensure high code quality, maintainability, and documentation.
Must-Have Skills:
- Strong expertise in Python for data science, scripting, and automation
- 5+ years of hands-on experience in AI/ML model development, data engineering, or data science
- Experience with Pandas, NumPy, Scikit-learn, TensorFlow, PyTorch, or similar libraries
- Proficient in SQL, data modeling, and cloud data tools (AWS, GCP, or Azure)
- Experience with Apache Airflow, Spark, or other big data tools is a plus
- Strong understanding of data warehousing, pipelines, and MLOps concepts
Preferred Qualifications:
- Prior experience in consulting engagements with US clients
- Familiarity with CI/CD pipelines and version control (Git)
- Excellent communication and problem-solving skills
- Ability to work independently in a remote, cross-time-zone environment
Job Types: Full-time, Part-time, Contractual/Temporary
Schedule: Night shift (US shift)
Experience: Data science: 5 years (required); AI: 1 year (required)
Shift availability: Night shift (required)
Work Location: Remote
Data Engineering Lead
Posted today
Job Description
At Teksalt Solutions, we specialize in connecting top-tier talent with leading companies to create dynamic, productive workforces. We are committed to delivering technology solutions that not only meet but exceed the demands of the modern business landscape.
We are currently seeking a Data Engineering Lead - AWS Glue & PySpark Specialist for a permanent full-time position in Bangalore. The ideal candidate should have 5 to 8 years of experience with skills in AWS Glue, PySpark, and Python.
Key Responsibilities:
- Spark & PySpark Development: Design and implement scalable data processing pipelines using Apache Spark and PySpark for large-scale data transformations.
- ETL Pipeline Development: Build, maintain, and optimize ETL processes for seamless data extraction, transformation, and loading across various data sources and destinations.
- AWS Glue Integration: Utilize AWS Glue to create, run, and monitor serverless ETL jobs for data transformations and integrations in the cloud.
- Python Scripting: Develop efficient, reusable Python scripts to support data manipulation, analysis, and transformation within the Spark and Glue environments.
- Data Pipeline Optimization: Ensure that all data workflows are optimized for performance, scalability, and cost-efficiency on the AWS Cloud platform.
- Collaboration: Work closely with data analysts, data scientists, and other engineering teams to create reliable data solutions that support business analytics and decision-making.
- Documentation & Best Practices: Maintain clear documentation of processes, workflows, and code while adhering to best practices in data engineering, cloud architecture, and ETL design.
Required Skills:
- Expertise in Apache Spark and PySpark for large-scale data processing and transformation.
- Hands-on experience with AWS Glue for building and managing ETL workflows in the cloud.
- Strong programming skills in Python, with experience in data manipulation, automation, and integration with Spark and Glue.
- In-depth knowledge of ETL principles and data pipeline design, including optimization techniques.
- Proficiency in working with AWS services such as S3, Glue, Lambda, and Redshift.
- Strong skills in writing optimized SQL queries, with a focus on performance tuning.
- Ability to translate complex business requirements into practical technical solutions.
- Familiarity with Apache Airflow for orchestrating data workflows.
- Knowledge of data warehousing concepts and cloud-native analytics tools.
If you are passionate about data engineering and have the required skills and experience, we welcome you to apply for this position. Join us at Teksalt Solutions, where a pinch of us makes all the difference in the world of technology.
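Tool specifics aside, the extract-transform-load pattern this role centres on can be sketched in plain Python; a Glue job would express the same three steps with PySpark DataFrames reading from S3 and writing to Redshift. All table and column names below are invented for illustration:

```python
import csv
import io
import sqlite3

# Extract: read raw records (a stand-in for reading from S3 or a JDBC source).
RAW_CSV = """order_id,amount,currency
1,100.50,usd
2,,usd
3,75.00,sgd
"""

def extract(text):
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Drop rows with missing amounts and normalise currency codes."""
    cleaned = []
    for row in rows:
        if not row["amount"]:
            continue  # skip incomplete records rather than loading bad data
        cleaned.append((int(row["order_id"]), float(row["amount"]), row["currency"].upper()))
    return cleaned

def load(rows, conn):
    """Write into a warehouse table (SQLite stands in for Redshift here)."""
    conn.execute("CREATE TABLE IF NOT EXISTS orders (order_id INTEGER, amount REAL, currency TEXT)")
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
print(conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone())  # (2, 175.5)
```

Keeping extract, transform, and load as separate functions, as here, is what makes each stage independently testable and retryable in a real pipeline.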
Manager, Data Engineering
Posted today
Job Description
Key Responsibilities:
- Lead and manage a team of Data Analysts.
- Design, build, and maintain scalable and efficient data pipelines and architecture.
- Ensure data quality, governance, security, and compliance.
- Perform ETL operations across multiple data sources and platforms.
- Improve and optimize data storage, retrieval, and scalability.
- Collaborate across business units to deliver data-driven solutions.
- Drive initiatives in advanced analytics, AI/ML, and emerging data technologies.
- Own and manage Power BI reporting framework and delivery.
- Manage analytics project timelines and deliverables.
Requirements:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 6+ years of experience in data engineering and analytics roles.
- Strong proficiency in Python, SQL, and Azure cloud platform.
- Experience with big data tools (e.g., Spark, Hadoop, Kafka).
- Knowledge of data modeling, data warehousing, and architecture.
- Strong leadership, project management, and communication skills.
EA License: 01C4394, R1872517
By sending us your personal data and curriculum vitae (CV), you are deemed to consent to PERSOLKELLY Singapore Pte Ltd and its affiliates collecting, using and disclosing your personal data for the purposes set out in the Privacy Policy. You also acknowledge that you have read, understood, and agree to the said Privacy Policy.
Intern, Data Engineering
Posted today
Job Description
Position overview:
You will be responsible for assisting with data collection, analysis, and reporting.
Key responsibilities:
- Work closely with procurement datasets to build data visualisations and dashboards that inform strategic planning and sound business decisions
- Understand functional and technical requirements from the Procurement team to build reports that deliver actionable insights to key stakeholders
- Assist the Procurement team to build and maintain data infrastructure (data pipelines, aggregated data sets, reports, dashboards) to facilitate development of key metrics to measure efficiency/impact/outcomes of day-to-day operations
- Automate Procurement data collection and streamline existing Procurement manual processes to enhance operational efficiency.
Job Specifications (Criteria of Eligibility):
- Currently pursuing a Bachelor's degree or higher, preferably major in Analytics, Mathematics, Statistics, Engineering, Computer Science or related field
- Familiar with at least one programming/scripting language (preferably Python, SQL, or Spark) and familiar with Excel or Power BI
- Experience with data modeling concepts, star schema and data vault
- Experience in data analysis and data collection required
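The star-schema concept listed above organises a warehouse around a central fact table whose rows reference descriptive dimension tables. A toy procurement example (all table and column names invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: one row per supplier, holding descriptive attributes.
cur.execute("CREATE TABLE dim_supplier (supplier_id INTEGER PRIMARY KEY, name TEXT, country TEXT)")
# Fact table: one row per purchase event, referencing the dimension by key.
cur.execute("""CREATE TABLE fact_purchase (
    purchase_id INTEGER PRIMARY KEY,
    supplier_id INTEGER REFERENCES dim_supplier(supplier_id),
    amount REAL)""")

cur.executemany("INSERT INTO dim_supplier VALUES (?, ?, ?)",
                [(1, "Acme", "SG"), (2, "Globex", "MY")])
cur.executemany("INSERT INTO fact_purchase VALUES (?, ?, ?)",
                [(10, 1, 500.0), (11, 1, 250.0), (12, 2, 300.0)])

# A typical dashboard query: total spend per supplier country,
# produced by joining the fact table to its dimension.
rows = cur.execute("""
    SELECT d.country, SUM(f.amount)
    FROM fact_purchase f JOIN dim_supplier d USING (supplier_id)
    GROUP BY d.country ORDER BY d.country""").fetchall()
print(rows)  # [('MY', 300.0), ('SG', 750.0)]
```

The same shape scales up directly: dashboards slice the facts by whichever dimension attributes the business cares about.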
Data Engineering Lead
Posted today
Job Description
Data Engineer Position
Are you looking for a challenging role that combines technical skills with business acumen? We have an exciting opportunity for a Data Engineer to join our team. In this role, you will be responsible for designing, building, and maintaining large-scale data systems. If you are passionate about working with data and enjoy solving complex problems, this could be the perfect fit for you.
As a Data Engineer, your primary responsibility will be to develop and maintain our company's data infrastructure. This includes designing and implementing data pipelines, ensuring data quality, and developing tools to support business intelligence initiatives. You will also be responsible for collaborating with cross-functional teams to gather requirements, design solutions, and implement changes. Additionally, you will be required to perform regular maintenance tasks such as backup and recovery, capacity planning, and performance tuning.
We are looking for a highly motivated individual with excellent problem-solving skills and experience in data engineering. If you have a strong background in computer science, mathematics, or a related field and are proficient in programming languages such as Python, Java, and Scala, we encourage you to apply. Experience with big data technologies such as Hadoop, Spark, and NoSQL databases is also a plus.
This is an exciting opportunity to work with cutting-edge technologies and contribute to the growth of our company. As a Data Engineer, you will have the chance to make a real impact on our business and help shape the future of our organization.
Requirements:
- Bachelor's degree in Computer Science, Mathematics, or a related field
- 5+ years of experience in data engineering or a related field
- Excellent problem-solving skills and attention to detail
- Strong understanding of data structures and algorithms
- Experience with big data technologies such as Hadoop, Spark, and NoSQL databases
- Proficiency in programming languages such as Python, Java, and Scala
- Ability to work collaboratively with cross-functional teams
- Strong communication and project management skills
About Us
We are a leading provider of innovative solutions in the field of data engineering. Our team is passionate about delivering high-quality products and services that meet the needs of our clients. We believe in fostering a culture of collaboration, innovation, and continuous learning. If you share our values and are looking for a new challenge, please submit your application today.
Data Engineering Expert
Posted today
Job Description
We are seeking a skilled Data Engineering Expert to join our team. The ideal candidate will have expertise in designing, building, and maintaining scalable data pipelines for efficient data processing.
Key Responsibilities:
Data Collection:
- Identify and gather data from various sources, including databases, APIs, and third-party services.
- Ensure the quality, consistency, and integrity of collected data.
- Automate data collection processes where possible.
Data Pipeline Design:
- Architect and implement scalable data pipelines for efficient data processing.
- Develop ETL (Extract, Transform, Load) processes to clean, transform, and load data into storage solutions like data lakes or warehouses.
- Ensure pipeline optimization for performance, reliability, and security.
- Implement monitoring and logging to maintain pipeline health.
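One common way to implement the monitoring-and-logging bullet above is to wrap every pipeline stage in a helper that records outcomes and retries transient failures. A minimal sketch (the stage name, failure mode, and retry policy are illustrative):

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def run_stage(name, fn, retries=2, delay=0.0):
    """Run one pipeline stage, logging each outcome and retrying on failure."""
    for attempt in range(1, retries + 2):
        try:
            result = fn()
            log.info("stage %s succeeded on attempt %d", name, attempt)
            return result
        except Exception:
            log.exception("stage %s failed on attempt %d", name, attempt)
            if attempt > retries:
                raise  # exhausted retries: surface the failure to the scheduler
            time.sleep(delay)

# Illustrative stage that fails once with a transient error, then succeeds.
calls = {"n": 0}
def flaky_extract():
    calls["n"] += 1
    if calls["n"] == 1:
        raise RuntimeError("transient source error")
    return ["record-1", "record-2"]

print(run_stage("extract", flaky_extract))  # ['record-1', 'record-2']
```

In production the log calls would feed an alerting system, so an exhausted retry loop pages the on-call engineer rather than failing silently.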
Data Storage and Management:
- Design and manage data storage solutions to ensure easy access and retrieval.
- Optimize data storage for cost-effectiveness and query performance.
- Implement data governance practices for compliance with organizational and legal requirements.
Data Visualization and Dashboard Development:
- Develop interactive dashboards using tools like Tableau or similar platforms.
- Collaborate with stakeholders to understand their needs and tailor visualizations to support decision-making.
- Ensure dashboards are user-friendly, visually appealing, and provide real-time or near-real-time data updates.
Insight Generation and Reporting:
- Analyze data to uncover trends, patterns, and anomalies that provide actionable insights.
- Prepare and present reports to stakeholders, translating complex data into clear and concise narratives.
- Provide recommendations based on data analysis to inform strategic initiatives.
Requirements:
To be successful in this role, you will need:
- Diploma and above
- Good English language skills (spoken and written)
- Ability to self-drive initiatives with limited guidance and coordinate with multiple parties with problem-solving skills
- Experience in designing, building, and maintaining batch and real-time data pipelines
- Experience with Databricks
- Experience with data visualization tools like Tableau/Tableau Cloud
- Experience implementing technical processes to enforce data security, data quality, and data governance
Duration/Working Hours:
This is a 12-month contract position requiring 42 hours of work per week.
Data Engineering Specialist
Posted today
Job Description
We are seeking a skilled Data Engineering Specialist to join our team. The successful candidate will be responsible for designing, developing, and maintaining large-scale data systems.
The ideal candidate will have proven experience in data analysis, reporting, and business intelligence, and should be proficient in SQL (Snowflake) and Python.
Experience in Machine Learning is an added benefit, as the candidate will need to work with complex data sets and develop predictive models.
They must also have strong analytical and problem-solving skills, as well as excellent attention to detail and communication skills.
- Familiarity with data warehousing concepts and database management systems
- Ability to work independently and collaboratively in a fast-paced environment
- Strong organisational and time management skills to handle multiple projects and meet deadlines
- Proficiency in SQL (Snowflake) and Python programming languages
What we offer:
- Opportunity to work on challenging projects and contribute to the growth of the company
- Collaborative and dynamic work environment
- Competitive salary and benefits package
Please submit your resume and cover letter to apply for this exciting opportunity.