Description
Hi Team
Requirement: Data Engineer
Experience: 5 to 8 years
Mandatory skills: PySpark or Spark
Python, SQL
Cloud: AWS, Azure, or GCP
Creation of data pipelines
Salary: years of experience × 4
Notice: immediate to 30 days
Location: Pune, Bangalore, Hyderabad, Chennai
Languages: Python, PySpark.
• AWS: Glue, Lambda, Athena, Lake Formation, ECS, IAM, SQS, SNS, KMS
• Spark (experience with AWS PaaS, not limited to EMR or on-prem distributions such as Cloudera)
Secondary skills
• Airflow (good-to-have understanding)
• PyTest or unittest (any testing framework)
• CI/CD: Drone, CircleCI, TravisCI, or any other tool (understanding of how it works)
• Understanding of configuration formats (YAML, JSON, Starlark, etc.)
• Terraform (IaC)
• Kubernetes (Understanding of containers and their deployment).
Notice period: 0 to 30 days; pan-India locations.