GCP Data Engineer

Employment Information

Full-time · Posted 3 weeks ago

Role & responsibilities

- Expertise in data processing frameworks: Apache Beam (Dataflow); an illustrative pipeline sketch follows this list

- Hands-on experience with GCP tools and technologies such as BigQuery, Dataflow, Cloud Composer, Cloud Spanner, GCS, and dbt

- Strong data engineering skills in Python and SQL

- Experience in ETL (Extract, Transform, Load) processes

- Knowledge of DevOps tools such as Jenkins, GitHub, and Terraform is desirable; good knowledge of Kafka (batch/streaming) is required

- Understanding of data models, with experience in ETL design and build and in database replication using message-based CDC (change data capture)

- Familiarity with cloud storage solutions

- Strong problem-solving skills applied to data engineering challenges

- Understanding of data security and scalability
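
To give a concrete sense of the Apache Beam (Dataflow) and Python/SQL work listed above, here is a minimal, hedged sketch of a batch pipeline that reads CSV files from GCS, parses them, and appends rows to a BigQuery table. The bucket, project, dataset, table, and field names are placeholders invented for illustration, not part of this posting; the same pipeline can run locally with the DirectRunner or on Dataflow by passing the usual runner flags.

```python
# Illustrative Apache Beam batch pipeline: GCS CSV -> parse -> BigQuery.
# All resource names below (bucket, project, dataset, table, columns) are
# hypothetical placeholders used only to show the shape of such a pipeline.
import argparse

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_csv_line(line: str) -> dict:
    """Turn one 'order_id,amount,currency' CSV line into a BigQuery row dict."""
    order_id, amount, currency = line.split(",")
    return {"order_id": order_id, "amount": float(amount), "currency": currency}


def run() -> None:
    parser = argparse.ArgumentParser()
    parser.add_argument("--input", default="gs://example-bucket/orders/*.csv")
    parser.add_argument("--output_table", default="example-project:analytics.orders")
    args, beam_args = parser.parse_known_args()

    # Remaining flags (e.g. --runner=DataflowRunner, --project, --region,
    # --temp_location) are passed straight through to Beam, so the same code
    # runs locally or on Dataflow.
    options = PipelineOptions(beam_args)

    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromGCS" >> beam.io.ReadFromText(args.input, skip_header_lines=1)
            | "ParseCSV" >> beam.Map(parse_csv_line)
            | "WriteToBigQuery"
            >> beam.io.WriteToBigQuery(
                args.output_table,
                schema="order_id:STRING,amount:FLOAT,currency:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```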