GCP Data Engineer / Architect

Permanent

Job Reference
IAD GCP Data Engineer/Architect - 443439
Job Type
Permanent
Location / Area
UK Wide (Remote / Client Site)
Salary Details
£55k - £110k DOE + Benefits
Start Date
ASAP
Key Skills
• AWS / Amazon Web Services (Athena, Redshift, Glue, EMR)
• GCP / Google Cloud Platform
• Java, Scala, Python, Spark, SQL
• Enterprise-grade ETL/ELT data pipelines
• Data manipulation/wrangling techniques
Benefits
Our client offers not only the chance to work on the UK’s most exciting projects, but also long-term career prospects, professional development, and a comprehensive, personalised benefits package.
Consultant
Peter Hirst
peter.hirst@deerfoot.co.uk
07917 725773

Job description

GCP Data Engineer / Architect
Remote Working – UK Wide
Competitive Package £60k - £110k depending on experience
Permanent

As a trusted and preferred recruitment partner to this prestigious global organisation, we have been asked to assist in the hire of several Data Engineers and Data Architects with experience in AWS and/or GCP.
Our client offers not only the chance to work on the UK’s most exciting projects, but also long-term career prospects, professional development, and a comprehensive, personalised benefits package.
Applications are accepted from candidates of all abilities. Whilst the list below is extensive, you do not need experience across the board; please still apply if you only meet a few of the criteria.

Key Responsibilities
• Build and deliver GCP data engineering solutions as part of a larger project
• Use Google data products (e.g. BigQuery, Dataflow, Dataproc, AI Building Blocks, Looker, Cloud Data Fusion, Dataprep, etc.) to build solutions for our customers
• Experience in Spark (Scala/Python/Java) and Kafka.
• Experience in MDM, Metadata Management, Data Quality and Data Lineage tools.
• E2E data engineering and lifecycle management (including non-functional requirements and operations).
• E2E solution design skills: prototyping, usability testing and data visualisation literacy.
• Experience with SQL and NoSQL modern data stores.
• Build relationships with client stakeholders to establish a high level of rapport and confidence
• Work with clients, local teams and offshore resources to deliver modern data products
• Apply GCP data-focused reference architectures
• Design and build data service APIs
• Analyse current business practices, processes and procedures, and identify future opportunities for leveraging GCP services
• Design solutions and support the planning and implementation of data platform services including sizing, configuration, and needs assessment
• Implement effective metrics and monitoring processes

Relevant Experience (Not all Essential)
• A minimum of 3-4 years’ experience with Google data products (e.g. BigQuery, Dataflow, Dataproc, AI Building Blocks, Looker, Cloud Data Fusion, Dataprep, etc.)
• Google Cloud Platform
• Java, Scala, Python, Spark, SQL
• Experience of developing enterprise-grade ETL/ELT data pipelines.
• Deep understanding of data manipulation/wrangling techniques
• Demonstrable knowledge