Description
The ideal candidate will develop high-quality applications and design and implement testable, scalable code.
Skills Required: Matillion and R
Responsibilities:
• Assist in product development using cloud platforms for ELT
• Use claims/patient data business rules to build out data flows to meet analysis objectives
• Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, Matillion, Snowflake, and Python
• Build analytics tools using Power BI and Tableau that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
• Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
• Use GitHub SCM for version and release management
• Perform testing and QC on all development work
• Work with data and analytics experts to strive for greater functionality in our data systems
• Comply with applicable information security responsibilities.
Qualifications:
• Experience with cloud data platforms, preferably Snowflake and Matillion ELT
• Experience working with large datasets (billions of rows)
• Experience with cloud data management tools: Azure Storage Explorer, FTP, Azure BLOB Storage, S3 Buckets
• Experience with R and Python programming
• Strong SQL knowledge and experience with relational databases, including query authoring, as well as working familiarity with a variety of databases; Snowflake or other cloud database experience preferred
• Experience building and optimizing claims data pipelines, architectures/data models and data sets.
• Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
• Experience with the Agile workflow, Test Driven Development, and continuous improvement/continuous deployment.
• Strong analytic skills related to working with structured and unstructured datasets.
• Experience building processes that support data transformation, data structures, metadata, dependency, and workload management using Azure Automation, Azure Function Apps, and Azure Data Factory.
• Strong self-organizational skills.
• 1+ years of experience in a Data Engineer or Software Engineer role, with a graduate degree in Software Engineering, Computer Science, or another quantitative field. Candidates should also have experience with the following software/tools:
• Experience with big data tools: Matillion or other ETL/ELT tools
• Experience with building relational SQL databases
• Experience with data pipeline and workflow management tools: Azure Data Factory, SSIS
• Experience with Python
• Experience with Git/GitHub and automating builds.
Nice to have, but not required:
• Experience working with US closed and open claims datasets (e.g., MarketScan, PharMetrics, IQVIA)
• Experience working with longitudinal patient data
• Experience with epidemiological terms and applications to patient data sets