Data Engineer (Digital Wallet) - US / Hybrid
Role: Data Engineer (Digital Wallet)
Location: Scottsdale, AZ (hybrid)
Duration: FTE & C2C

Job Description:

Must-have skillset: Java, Scala, S3, Glue, AWS, Redshift.

• 6-8 years of IT experience focusing on enterprise data architecture and management
• Experience in Conceptual/Logical/Physical Data Modeling and expertise in Relational and Dimensional Data Modeling
• Experience with Databricks (cloud and on-prem), Structured Streaming, Delta Lake concepts, and Delta Live Tables required
• Experience with Spark (Scala and Java) programming
• Data Lake concepts such as time travel, schema evolution, and optimization
• Structured Streaming and Delta Live Tables with Databricks a bonus
• Experience leading and architecting enterprise-wide initiatives, specifically system integration, data migration, transformation, data warehouse builds, data mart builds, and data lake implementation/support
• Advanced understanding of streaming data pipelines and how they differ from batch systems
• Ability to formalize how to handle late data, define windows, and manage data freshness
• Advanced understanding of ETL and ELT, and of ETL/ELT tools such as Data Migration Service
• Understanding of concepts and implementation strategies for different incremental data loads, such as tumbling windows, sliding windows, high watermarks, etc.
• Familiarity and/or expertise with Great Expectations or other data quality/data validation frameworks a bonus
• Familiarity with concepts such as late data, defining windows, and how window definitions impact data freshness
• Advanced SQL experience (joins, aggregation, windowing functions, common table expressions, RDBMS schema design and performance optimization)
• Indexing and partitioning strategy experience
• Ability to debug, troubleshoot, design, and implement solutions to complex technical issues
• Experience with large-scale, high-performance enterprise big data application deployment and solutions
• Architecture experience in an AWS environment a bonus
• Familiarity working with Lambda, specifically how to push and pull data and how to use AWS tools to view data when processing massive data at scale, a bonus
• Experience with GitLab and CloudWatch, and ability to write and maintain GitLab CI/CD pipelines
• Experience configuring and optimizing AWS Lambda functions, and experience with S3
• Familiarity with Schema Registry and message formats such as Avro, ORC, etc.
• Ability to thrive in a team-based environment
• Experience briefing the benefits and constraints of technology solutions to technology partners, stakeholders, team members, and senior management

Apply to this job
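To illustrate one of the concepts named above, here is a minimal, hypothetical sketch (not part of the role description) of a high-watermark incremental load in Java: only records updated after the last saved watermark are picked up, and the watermark then advances to the newest timestamp seen. The `Rec` type and method names are illustrative assumptions, not any specific library's API.

```java
import java.util.List;
import java.util.stream.Collectors;

public class WatermarkLoad {
    // Illustrative record type: an id plus a last-updated timestamp.
    record Rec(int id, long updatedAt) {}

    // Select only records strictly newer than the saved watermark.
    static List<Rec> loadSince(List<Rec> source, long watermark) {
        return source.stream()
                     .filter(r -> r.updatedAt() > watermark)
                     .collect(Collectors.toList());
    }

    // Advance the watermark to the newest timestamp in the batch
    // (or keep it unchanged if the batch is empty).
    static long advance(List<Rec> batch, long watermark) {
        return batch.stream().mapToLong(Rec::updatedAt).max().orElse(watermark);
    }

    public static void main(String[] args) {
        List<Rec> source = List.of(new Rec(1, 100L), new Rec(2, 150L), new Rec(3, 90L));
        List<Rec> batch = loadSince(source, 100L);  // only id=2 is newer than 100
        long newWatermark = advance(batch, 100L);   // watermark moves to 150
        System.out.println(batch.size() + " " + newWatermark);
    }
}
```

The same idea scales up in production pipelines: the watermark is persisted between runs, and each run reads only the slice of source data above it rather than reloading the full table.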