GCP Data Engineer | UK wide | Remote | Up to £90,000
If you have a sixth sense for data science, there’s a digital revolution underway in the UK that needs the talents of people who are at home analysing, managing, designing, and predicting with complex datasets, and breathing life into AI systems.
As a Big Data Engineer, you will join an innovative UK Advanced Analytics team tasked with creating and managing Big Data infrastructure and tools, typically designing and building big data pipelines and platforms.
With a competitive benefits package and a certified Great Place to Work – you will be supported in taking your career anywhere you want it to go.
As Senior Big Data Consultant | GCP Data Engineer, you’ll work on the collection, storage, processing, and analysis of huge data sets. Your primary focus will be choosing the optimal solutions for these purposes, then implementing, maintaining, and monitoring them. You will also be responsible for integrating them with the architecture used across the business.
As a senior consultant and Big Data expert, you will establish ETL/ELT best practices in a big data environment, building big data pipelines and real-time streaming with Kafka, and drawing on a strong grasp of distributed computing principles.
You will select and integrate the Big Data tools and frameworks required to deliver requested capabilities, implement ETL/ELT processes, and work closely with cloud big data architecture on GCP, acting as an SME on big data tools and platforms, data management tools, and the necessary infrastructure changes.
You will have a track record of building scalable big data solutions and acting as a trusted technology advisor, with core knowledge across a variety of tools and platforms in the big data landscape, including an appreciation of data management, governance, and security.
Python and Spark for data pipelines
Appreciation of ETL/ELT, Lambda architecture, and data security and GDPR controls
Strong data engineering and technical architecture skills on GCP are essential
Hands-on experience designing and delivering GCP data applications covering Cloud Composer, BigQuery, Dataproc, and Pub/Sub. Experience with other cloud platforms is desirable but not essential.
Streaming frameworks such as Kafka; understanding of data warehousing and data modelling techniques
Appreciation of database technologies such as NoSQL, graph, and in-memory databases
Enthusiasm for experimenting with new technologies in a lab-style setup
Nice to have:
Snowflake cloud DW
Some experience with visualisation tools such as Power BI (expert level not required)
Understanding of Elasticsearch, the Hadoop stack, Airflow, Apache NiFi, Apache Beam, and Storm
Understanding of Docker, Kubernetes, CI/CD
The role can be office or home-based, flexible or hybrid.
Please get in touch with the team at Jonothan Bosworth today.
We are an equal opportunities employer, committed to diversity and inclusion. We are active anti-slavery advocates and prohibit discrimination and harassment of any kind based on race, colour, sex, religion, sexual orientation, national origin, disability, genetic information, pregnancy, or any other protected characteristic.