Software Engineer/Developer - Big Data & Analytics
We're looking for software engineers to:
- drive the development of a new set of platform capabilities in the 'Digital Core' that will enable real-time processing, analytics, and structured workflow
- provide technology solutions that will solve business problems and strengthen our position as digital leaders in financial services
- collaborate closely with key stakeholders in Operations through cross-functional teams (hybrid pods) to ensure we deliver the right thing
- contribute to skill-aligned chapters working across the Digital Operations platform to ensure we build solutions in the right way
- provide technical expertise and recommendations when assessing new software projects and initiatives, to support and enhance our existing applications
- conduct code reviews and test software as needed, along with participating in application architecture and design and other phases of SDLC
- ensure that proper operational controls and procedures are in place as processes move from test to production
You'll be working as part of the Operations IT Technology team in the Investment Bank. You will be part of a highly motivated and talented group where innovation is not only encouraged but expected! Working closely with stakeholders to develop best-in-class solutions, alongside cross-team collaboration with like-minded experts, is the UBS definition of the new normal.
Your expertise:
- proficient in Python and/or Java, with 5-7 years of experience in programming and software development
- solid knowledge of statistical and predictive modelling techniques, with strong analytical and problem-solving skills
- extensive experience in predictive modelling using advanced mathematics and statistics, segmentation and clustering, and time-series analysis of financial market data
- experience with algorithms for rule/pattern mining, text mining, anomaly detection, machine learning, and AI, implemented using standard APIs and computational packages such as TensorFlow, Theano, PyTorch, Keras, scikit-learn, NumPy, SciPy, pandas, StatsModels, and Spark ML
- experience with big data technologies like Hadoop and Spark
- experience with Apache Flink/Apache Kafka and the ELK stack (Elasticsearch, Logstash, and Kibana) is highly desirable
- experience with ETL tools (Informatica, Alteryx) and data-visualization tools such as Tableau or Power BI is an added advantage
- familiarity with CI/CD (TeamCity, Jenkins) and Git/GitHub/GitLab
- familiarity with Docker/containerization technologies
- familiarity with Microsoft Azure
- proven track record in an agile SDLC in a large-scale enterprise environment
- knowledge of post-trade processing in large financial institutions is an added bonus!
Austin Fraser is acting as an Employment Business in relation to this vacancy.
Austin Fraser is committed to being an equal opportunities employer, and encourages applications from candidates regardless of sex, race, disability, age, sexual orientation, gender reassignment, religion or belief, marital status, or pregnancy and maternity status.
Due to the volume of applications received, we are unable to provide individual feedback to unsuccessful applicants.