Senior GCP Data Engineer / Senior Consultant Specialist

hackajob · Pune

Job Description

hackajob is collaborating with HSBC to connect them with exceptional professionals for this role.

Some careers shine brighter than others.

If you're looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further.

HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions.

We are currently seeking an experienced professional to join our team in the role of Senior Consultant Specialist.

In this role, you will:
  • Cloud Migration: Migrate and re-engineer existing services from on-premises data centres to the Cloud (GCP).
  • Design and Develop Scalable Data Pipelines: Architect and implement end-to-end data workflows using Apache Airflow for orchestration, integrating multiple data sources and sinks across cloud and on-premises environments.
  • BigQuery Data Modeling and Optimization: Build and optimize data models in Google BigQuery for performance and cost-efficiency, including partitioning, clustering, and materialized views to support analytics and reporting use cases.
  • ETL/ELT Development and Maintenance: Design robust ETL/ELT pipelines to extract, transform, and load structured and semi-structured data, ensuring data quality, reliability, and availability.
  • Cloud-Native Engineering on GCP: Leverage GCP services like Cloud Storage, Pub/Sub, Dataflow, and Cloud Functions to build resilient, event-driven data workflows.
  • CI/CD and Automation: Implement CI/CD for data pipelines using tools like Cloud Composer (managed Airflow), Git, and Terraform, ensuring automated deployment and versioning of workflows.
  • Data Governance and Security: Ensure proper data classification, access control, and audit logging within GCP, adhering to data governance and compliance standards.
  • Monitoring and Troubleshooting: Build proactive monitoring for pipeline health and data quality using tools such as Cloud Monitoring (formerly Stackdriver) and custom Airflow alerting mechanisms.
  • Collaboration and Stakeholder Engagement: Work closely with data analysts, data scientists, and business teams to understand requirements and deliver high-quality, timely data products.
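To illustrate the BigQuery modelling work described above: performance and cost efficiency in BigQuery typically come from partitioned, clustered table definitions that let the engine prune data at scan time. The sketch below generates such DDL in Python; the project, dataset, table, and column names are purely hypothetical, as the posting names no specific schema.

```python
# Illustrative sketch only: the schema and identifiers below are hypothetical,
# not taken from the role description.

def build_table_ddl(project: str, dataset: str, table: str,
                    partition_col: str, cluster_cols: list[str]) -> str:
    """Return CREATE TABLE DDL with daily partitioning and clustering."""
    clusters = ", ".join(cluster_cols)
    return (
        f"CREATE TABLE `{project}.{dataset}.{table}` (\n"
        f"  event_ts TIMESTAMP,\n"
        f"  customer_id STRING,\n"
        f"  amount NUMERIC\n"
        f")\n"
        # Daily partitioning prunes scans (and cost) to the days a query touches.
        f"PARTITION BY DATE({partition_col})\n"
        # Clustering co-locates rows with equal key values, cheapening filters.
        f"CLUSTER BY {clusters};"
    )

ddl = build_table_ddl("my-project", "analytics", "transactions",
                      "event_ts", ["customer_id"])
print(ddl)
```

A query filtering on `DATE(event_ts)` against such a table scans only the matching partitions rather than the whole table, which is the cost-efficiency lever the role calls out.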

To be successful in this role, you should meet the following requirements:

  • 8+ years' overall experience, primarily in a DWH/ETL background, with the following skills:
  • 5+ years' hands-on experience with Python (mandatory)
  • 5+ years' hands-on experience with GCP BigQuery (mandatory)
  • 4+ years' hands-on experience with Apache Airflow / Cloud Composer (mandatory)
  • 3+ years' hands-on experience with PL/SQL scripting (mandatory)
  • 5+ years' hands-on experience with ETL tools such as DataStage, Informatica, or Prophecy (mandatory)
  • 3+ years' hands-on experience with real-time data processing using Pub/Sub, Kafka, or NiFi
  • GCP Associate Cloud Engineer (ACE) certification is an added advantage, as is AI experience

You'll achieve more when you join HSBC.

www.hsbc.com/careers

HSBC is committed to building a culture where all employees are valued and respected and where opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment.

Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website.

Issued by – HSBC Software Development India
