Our premier financial and banking client is seeking a Senior Data Engineer to join their team as a contract-to-hire hybrid employee, working 2 days onsite in Scottsdale, AZ and 3 days remotely.
*CLIENT NOT ABLE TO PROVIDE SPONSORSHIP*
JOB SUMMARY & PRINCIPAL DUTIES:
- Solid experience with, and understanding of, the considerations for large-scale design and operationalization of data warehouses, data lakes, and analytics platforms on GCP is a must.
- Monitor the data lake and warehouse to ensure that the appropriate support teams are engaged at the right times.
- Design, build, and test scalable data ingestion pipelines; perform end-to-end automation of the ETL process for the various datasets being ingested.
- Participate in peer reviews and provide feedback to engineers, keeping development best practices and business and technical requirements in view.
- Determine the best way to extract application telemetry data, structure it, and send it to the proper tool for reporting (e.g., Kafka, Splunk).
- Work with business and cross-functional teams to gather and document requirements to meet business needs.
- Provide support as required to ensure the availability and performance of ETL/ELT jobs.
- Provide technical assistance and cross training to business and internal team members.
- Collaborate with business partners for continuous improvement opportunities.
JOB SPECIFICATIONS:
Education: Bachelor's Degree in Computer Science, Information Technology, Engineering, or related field
Experience, Skills & Qualifications:
- 6+ years of experience in Data Engineering with an emphasis on Data Warehousing and Data Analytics.
- 4+ years of experience with one of the leading public clouds.
- 4+ years of experience designing and building scalable data pipelines for extraction, transformation, and loading.
- 4+ years of experience with Python or Scala, with working knowledge of notebooks.
- 2+ years of hands-on experience on GCP cloud data implementation projects (Dataflow, Dataproc, Cloud Composer, BigQuery, Cloud Storage, GKE, Airflow, etc.).
- At least 2 years of experience in data governance and metadata management.
- Ability to work independently, solve problems, and keep stakeholders updated.
- Ability to analyze, design, develop, and deploy solutions per business requirements.
- Strong understanding of relational and dimensional data modeling.
- Experience in DevOps and CI/CD related technologies.
- Excellent written and verbal communication skills, including experience with technical documentation and the ability to communicate with senior business managers and executives.
Job Types: Full-time, Contract
Pay: $90.00 - $100.00 per hour
Expected hours: 40 per week
Benefits:
- 401(k)
- Dental insurance
- Flexible schedule
- Health insurance
- Paid time off
- Vision insurance
Experience:
- Data Warehousing/Data Analytics: 6 years (Required)
- Python or Scala: 4 years (Required)
- Google Cloud Platform implementation: 2 years (Required)
Ability to Relocate:
- Scottsdale, AZ: Relocate before starting work (Required)
Work Location: In person