
A Google Cloud Platform (GCP) data engineer designs, builds, and maintains data pipelines to support business needs. They work with analysts, engineers, and planners to create scalable, reliable, and efficient data pipelines.


  • Role: GCP Data Engineer
  • Location: All PSL Locations
  • Experience: 6-12 Years
  • Job Type: Full-Time Employment


What You'll Do:


  • Your role focuses on the design, development, and delivery of solutions involving data integration, processing, and governance.
  • Data storage and computation frameworks; performance optimization.
  • Analytics and visualization.
  • Infrastructure and cloud computing.
  • Data management platforms.
  • Implement scalable architectural models for data processing and storage.
  • Build functionality for data ingestion from multiple heterogeneous sources in batch and real-time modes.
  • Build functionality for data analytics, search, and aggregation.


Expertise You'll Bring:


  • Overall 6+ years of IT experience, with 3+ years in data-related technologies.
  • Minimum 2.5 years of experience in Big Data technologies and working exposure to related data services on at least one cloud platform (AWS / Azure / GCP).
  • Hands-on experience with the Hadoop stack: HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow, and other components required to build an end-to-end data pipeline.
  • Strong experience in at least one of the programming languages Java, Scala, or Python; Java preferred.
  • Hands-on working knowledge of NoSQL and MPP data platforms such as HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, and GCP BigQuery.
  • Well-versed in, and working knowledge of, data-platform-related services on at least one cloud platform, including IAM and data security.
  • Good knowledge of, and hands-on experience with, traditional ETL tools (Informatica, Talend, etc.) and database technologies (Oracle, MySQL, SQL Server, Postgres).
  • Knowledge of data governance processes (security, lineage, catalog) and tools such as Collibra and Alation.
  • Knowledge of distributed messaging frameworks such as ActiveMQ / RabbitMQ / Solace, search and indexing, and microservices architectures.
  • Performance tuning and optimization of data pipelines.
  • CI/CD: infrastructure provisioning on cloud, automated build and deployment pipelines, code quality.
  • Cloud data specialty and other related Big Data technology certifications.

Responsibilities

ETL, Bitbucket, Vertex AI, GCP, AWS, Python, Data Integration, Testing, Control, Project Management, Quality Assurance, SSIS

6 to 12 Years

Pune / Mumbai / Nagpur / Hyderabad / Bengaluru / Goa / Kolkata / Jaipur / Noida



Contact TaaS Technologies Pvt Ltd