The Senior Data Engineer role is centered on our core data systems: Data Warehouse, Data Lake, and BI. This position is responsible for the coding, review, and acceptance of Data Movement / Data Integration code promoted into production. The Senior Data Engineer is involved in projects from inception through design and development and into post-production support, ensuring that delivered items are of high quality and do not destabilize production. The role utilizes the full suite of integration tools, technologies, and practices, ensuring compliance with configuration management and security best practices across the organization. It is a combined role covering design, development, and operations responsibilities.
What You'll Do:
- Develops and maintains scalable data pipelines and builds out new integrations to support continued growth in data volume and complexity.
- Collaborates with analytics and business teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision making across the organization.
- Implements processes and systems to monitor data quality, ensuring production data is always accurate and available for key stakeholders and business processes that depend on it.
- Writes unit/integration tests, contributes to engineering wiki, and documents work.
- Performs the data analysis required to troubleshoot and resolve data-related issues.
- Works closely with a team of frontend and backend engineers, product managers, and analysts.
- Defines company data assets (data models) and the documentation and transformation jobs that populate them.
- Designs data integrations and data quality framework.
- Designs and evaluates open source and vendor tools for data lineage.
- Works closely with all business units and engineering teams to develop a strategy for the long-term data platform architecture.
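For context, the data-quality monitoring described in the responsibilities above can be sketched as a simple rule-based check. This is an illustrative sketch only: the field names, rules, and thresholds are hypothetical, and a production version would run inside an orchestrated pipeline rather than as a standalone script.

```python
# Minimal rule-based data-quality check (illustrative; field names and
# rules are hypothetical, not taken from this role description).

def check_row(row):
    """Return a list of quality violations for a single record."""
    violations = []
    if row.get("customer_id") is None:
        violations.append("missing customer_id")
    amount = row.get("amount")
    if amount is None or amount < 0:
        violations.append("invalid amount")
    return violations

def quality_report(rows):
    """Summarize how many records pass the checks."""
    failed = [r for r in rows if check_row(r)]
    total = len(rows)
    return {
        "total": total,
        "failed": len(failed),
        "pass_rate": 1 - len(failed) / total if total else 1.0,
    }

sample = [
    {"customer_id": 1, "amount": 10.0},
    {"customer_id": None, "amount": 5.0},
    {"customer_id": 2, "amount": -3.0},
]
print(quality_report(sample))  # 1 of 3 sample rows passes
```

In practice the pass rate would be emitted as a metric so stakeholders can be alerted when production data drifts below an agreed threshold.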
What You'll Need:
The Senior Data Engineer role requires experience with the following:
- University Degree or College Equivalent
- 10+ years of IT experience in large-scale technology architecture, operations, and design-related disciplines
- 3+ years developing and supporting applications leveraging the Cloud technology stack (AWS, Azure, GCP)
- 3+ years of experience with data integration/ETL tools such as Databricks, DataStage, or alternatives
- SDLC knowledge in both waterfall and agile methodologies
- Hands-on experience with source code management systems (SVN, Git) and continuous integration tools (Jenkins, Azure DevOps)
- Experience with SQL, Scala, Python, Spark, Kafka, and related tools and languages
- Experience handling data processing and delivering distributed, highly scalable applications
- Experience building solutions in Cloud-based environments (Azure or GCP experience preferred)
- Experience with large-scale domain or enterprise solution analysis, development, selection, and implementation
- Experience with high-volume, transaction processing software applications
- Good understanding of workload management, schedulers, scalability and distributed platform architectures
- Software development and architecture experience using Java EE technologies (application servers, Enterprise Service Bus, SOA, messaging, data access layers)
- Experience with scripting and automation languages such as Bash, Perl, and Python
- Experience in data warehousing, analytics, and business intelligence/visualization/presentation
- Experience using SQL against relational databases
- 5+ years of hands-on experience with Linux, AIX, and z/OS
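As context for the SQL requirement above, a minimal example of running SQL against a relational database, using Python's built-in sqlite3 module; the orders table and its data are invented purely for illustration:

```python
import sqlite3

# In-memory SQLite database; the orders table is invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL)"
)
conn.executemany(
    "INSERT INTO orders (region, amount) VALUES (?, ?)",
    [("east", 100.0), ("east", 50.0), ("west", 75.0)],
)

# Aggregate query of the kind common in warehousing/BI work.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('east', 150.0), ('west', 75.0)]
conn.close()
```

In a warehouse setting the same GROUP BY pattern would run against far larger fact tables, but the SQL itself is identical.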
Knowledge, Skill and Ability Requirements:
- Excellent communication skills (both written and oral) combined with strong interpersonal skills
- Strong analytical skills and thought processes combined with the ability to be flexible and work analytically in a problem-solving environment
- Attention to detail
- Strong organizational & multi-tasking skills