Data Engineering

Location: Mumbai, Maharashtra

Category: IT Engineer & Developer Jobs

Responsibilities:

● More than 15 years of experience in technical, solutioning, and analytical roles.
● 5+ years of experience in building and managing Data Lakes, Data Warehouses, Data Integration, Data Migration, and Business Intelligence/Artificial Intelligence solutions in the Cloud (GCP/AWS/Azure).
● Ability to understand business requirements, translate them into functional and non-functional areas, and define non-functional boundaries in terms of availability, scalability, performance, security, resilience, etc.
● Experience in architecting, designing, and implementing end-to-end data pipelines and data integration solutions for varied structured and unstructured data sources and targets.
● Experience working in distributed computing and enterprise environments such as Hadoop and GCP/AWS/Azure Cloud.
● Well-versed in various cloud Data Integration and ETL technologies such as Spark, PySpark/Scala, Dataflow, Dataproc, EMR, etc.
● Experience with traditional ETL tools such as Informatica, DataStage, OWB, Talend, etc.
● Deep knowledge of one or more cloud and on-premise databases such as Cloud SQL, Cloud Spanner, Bigtable, RDS, Aurora, DynamoDB, Oracle, Teradata, MySQL, DB2, SQL Server, etc.
● Exposure to NoSQL databases such as MongoDB, CouchDB, Cassandra, graph databases, etc.
● Experience in architecting and designing scalable data warehouse solutions in the cloud on BigQuery or Redshift.
● Experience with one or more data integration, storage, and data pipeline toolsets such as S3, Cloud Storage, Athena, Glue, Sqoop, Flume, Hive, Kafka, Pub/Sub, Kinesis, Dataflow, Dataproc, Airflow, Composer, Spark SQL, Presto, EMRFS, etc.
● Preferred: experience with Machine Learning frameworks such as TensorFlow, PyTorch, etc.
● Good understanding of cloud solutions for IaaS, PaaS, SaaS, containers, and microservices architecture and design.
● Ability to compare products and tools across technology stacks on Google, AWS, and Azure Cloud.
● Good understanding of BI Reporting and Dashboarding and one or more associated toolsets such as Looker, Tableau, Power BI, SAP BO, Cognos, Superset, etc.
● Understanding of security features and policies in one or more cloud environments such as GCP/AWS/Azure.
● Experience in business transformation projects moving on-premise data solutions to clouds such as GCP/AWS/Azure.

Role:

● Lead multiple data engagements on GCP Cloud for data lakes, data engineering, data migration, data warehouses, and business intelligence.
● Interface with multiple stakeholders within IT and business to understand the data requirements.
● Take complete responsibility for the successful delivery of all allocated projects on the parameters of schedule, quality, and customer satisfaction.
● Responsible for the design and development of distributed, high-volume, multi-threaded batch, real-time, and event processing systems.
● Implement processes and systems to validate data and monitor data quality, ensuring production data is always accurate and available for the key stakeholders and business processes that depend on it.
● Work with the Pre-Sales team on RFPs and RFIs and help them by creating data solutions.
● Mentor young talent within the team; define and track their growth parameters.
● Contribute to building assets and accelerators.

Other Skills:

● Strong communication and articulation skills.
● Good leadership skills.
● Should be a good team player.
● Good analytical and problem-solving skills.

Apply on the company website.