
Overview

Essential Experience

* Software development experience with Hadoop and Spark using JVM languages.

* Experience leading the development of large-scale data processing solutions, taking responsibility for non-functional requirements of ETL/ELT data pipelines such as robustness, performance and security. 'Development' incorporates design, code, test, defect resolution and operational readiness, and includes setting the standards for these activities.

* Strong data development disciplines, with the ability to gain the respect of a junior team.

* Ability to advise architects and other stakeholders on detailed technology choices and development practices, and on development estimates.

* Ability to make effective decisions within fast-moving Agile delivery and to lead on troubleshooting.

* Strong understanding of tools, design methodologies and best practice.

* Ability to communicate technical designs clearly, both in writing and verbally.

Desirable Experience

* Software development experience with Cloudera's distribution of Apache Hadoop and with Python.

* Experience of data visualisation and complex data transformations, including ETL tools such as Talend.

* Able to productionise machine learning algorithms.

* Understanding of text processing including Natural Language Processing.

* Experience with streaming and event-processing architectures, including technologies such as Kafka and change-data-capture products.

* Data modelling experience with data storage technologies such as document, graph and log stores, and other non-relational platforms.

* Open source contributor.

Austin Fraser is acting as an Employment Agency in relation to this vacancy.

Austin Fraser is committed to being an equal opportunities employer, and encourages applications from candidates regardless of sex, race, disability, age, sexual orientation, gender reassignment, religion or belief, marital status, or pregnancy and maternity status.

Due to the volume of applications received, we are unable to provide individual feedback to unsuccessful applicants.