Big Data Developer / Analyst
Big Data Developer / Analyst | Standard Chartered Bank | Full-time | Diploma/Degree | Hadoop, Apache Hive, PySpark, SQL, Azure DevOps, Control-M, Bash Scripting, Dataiku, Tableau, MSTR, ELT/ETL, HDFS, Agile, SDLC, Reference Data Management, Problem-Solving, Communication
Contributed to Financial Crime Surveillance Operations (FCSO) by delivering data engineering solutions that handled raw and structured data across Big Data platforms. Supported data pipelines, analytics and reporting infrastructure, and regulatory reporting requirements using industry-standard tools. Collaborated with cross-functional stakeholders to build scalable, secure data processing workflows.
Key Responsibilities:
- Built and optimized ELT/ETL pipelines using Hadoop, Hive, and PySpark for regulatory and business use cases (see the PySpark sketch after this section).
- Maintained and monitored batch job orchestration in Control-M, keeping daily operations reliable and on schedule.
- Followed Agile ceremonies, Azure DevOps version control practices, and SDLC workflows for development and deployment.
- Supported reporting and analytics platforms (Tableau, MSTR, Dataiku) for FCSO dashboards and insights.
- Ensured compliance with internal risk, control, and conduct standards, contributing to risk mitigation efforts.
Tools/Technologies Used: Hadoop, Hive, HDFS, PySpark, SQL, Bash, Control-M, Azure DevOps, Tableau, Dataiku, MSTR
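The following is a minimal PySpark ELT sketch of the kind of pipeline described above: raw files landed on HDFS are cleansed and appended to a partitioned Hive table for downstream reporting. All paths, table names, and column names are illustrative placeholders, not actual FCSO artifacts.

```python
# Minimal PySpark ELT sketch: load raw delimited files from HDFS,
# apply basic cleansing, and write to a partitioned Hive table.
# Paths, table names, and column names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("fcso_daily_ingest")      # hypothetical job name
    .enableHiveSupport()
    .getOrCreate()
)

# Extract: raw pipe-delimited files landed on HDFS by an upstream feed
raw = (
    spark.read
    .option("header", True)
    .option("delimiter", "|")
    .csv("hdfs:///data/raw/transactions/")   # placeholder path
)

# Transform: de-duplicate, standardise types, tag the load date
cleaned = (
    raw.dropDuplicates(["txn_id"])
    .withColumn("txn_amount", F.col("txn_amount").cast("decimal(18,2)"))
    .withColumn("load_date", F.current_date())
)

# Load: append into a date-partitioned Hive table for downstream reporting
(
    cleaned.write
    .mode("append")
    .partitionBy("load_date")
    .saveAsTable("fcso_db.transactions_curated")   # placeholder table
)
```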
Notable Achievements:
- Streamlined big data ingestion workflows, reducing the latency of daily surveillance reports by 30%.
- Played a key role in implementing automated job monitoring scripts, reducing manual intervention and job failures (a monitoring sketch follows this list).
- Recognized for contributing to the data governance setup for FCSO operations, ensuring high data quality and compliance.
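The sketch below illustrates one way such automated monitoring could work: scan scheduler log files for failure markers and email a summary so failed loads are re-run promptly. The log location, failure strings, and mail settings are assumptions for illustration only, not the actual Control-M integration used.

```python
# Hypothetical job-monitoring sketch: scan scheduler log files for failure
# markers and raise an email alert. Log paths, markers, and addresses are
# assumed placeholders.
import glob
import smtplib
from email.message import EmailMessage

LOG_GLOB = "/var/log/batch/*.log"           # placeholder log directory
FAILURE_MARKERS = ("ENDED NOTOK", "ERROR")  # placeholder failure strings

def find_failures(log_glob: str) -> list[str]:
    """Return log lines that contain a failure marker."""
    failures = []
    for path in glob.glob(log_glob):
        with open(path, encoding="utf-8", errors="ignore") as fh:
            for line in fh:
                if any(marker in line for marker in FAILURE_MARKERS):
                    failures.append(f"{path}: {line.strip()}")
    return failures

def send_alert(failures: list[str]) -> None:
    """Email a summary of failed jobs to a support mailbox (placeholder)."""
    msg = EmailMessage()
    msg["Subject"] = f"[FCSO] {len(failures)} batch job failure(s) detected"
    msg["From"] = "batch-monitor@example.com"
    msg["To"] = "data-support@example.com"
    msg.set_content("\n".join(failures))
    with smtplib.SMTP("localhost") as smtp:
        smtp.send_message(msg)

if __name__ == "__main__":
    failed = find_failures(LOG_GLOB)
    if failed:
        send_alert(failed)
```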