PeopleReady - Senior Cloud Data Engineer | Poland
Category: Information Technology
Position Type: Permanent
Job Description
Location – Remote, Poland
We are looking for a Senior Cloud Data Engineer to join our team of data experts. This person will be responsible for expanding our Data Lake and Data Warehouse architecture and optimizing it for cost and performance. The ideal candidate has experience with large-scale data platforms and enjoys building and optimizing them from the ground up.
The Senior Cloud Data Engineer will work alongside our Data Analytics Engineers, Data Scientists, Database Administrators and Software Engineers on data initiatives, and will ensure that an optimal data delivery architecture is applied consistently across ongoing projects. As Senior Cloud Data Engineer you must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products. If you are the right candidate, you will be excited by the prospect of designing and optimizing our company's data architecture to support our next generation of products and data initiatives.
Responsibilities
- Design and maintain the data platform in the AWS Cloud
- Cooperate closely with stakeholders in building end-to-end data products
- Build the ELT/ETL infrastructure required for data onboarding using AWS technologies
- Assemble large, complex data sets that meet functional / non-functional business requirements
- Design and implement standards across Data Teams
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Create data tools for analytics team members that assist them in building and optimizing our product into an innovative industry leader
- Protect data from unauthorized access and usage
Qualifications
- Solid understanding of AWS cloud services, in particular: Glue, DMS, Redshift, Athena, S3, SageMaker
- Experience with building Data Lake and Data Warehouse
- Experience with data pipeline tools such as Spark, Kinesis, or Kafka
- Experience with Python and SQL
- Experience with orchestration tools, for example: Airflow, EventBridge, SQS, SNS, etc.
Good to have
- Experience with dbt
- Experience with infrastructure and DevOps / DataOps tools and languages: CDK, CloudFormation, GitLab, etc.
- Experience in ML / ML Ops area
- Experience with Docker and container orchestration services like ECS or Kubernetes
- Experience with Snowflake
What's in it for me?
- Competitive offer package including multisport card, health insurance & vouchers
- The best medical coverage on the market with free dental care
- Annual bonus plan
- Remote work from home
- One extra day annually to use for holiday
- Opportunity to contribute to developing and maintaining a live platform product that makes an impact on people's lives
Next steps
If this sounds like you, we would love to hear from you. Click the apply button and start your application today!