- Leeds, England
About
Role: Data & AI Architect
Location: Leeds (LS15)/Hybrid
Salary: £57,000 to £72,000
Contract type: Permanent
Employment type: Full time
Working hours: 40 hours per week, Mon-Fri, 9:00 to 5:30
As an Architect specialising in the Data Analytics (DA) and Data Science domain, you will build technical relationships across the organisation and operate as a trusted advisor, ensuring maximum value from the cloud at every stage of the customer's journey in adopting DA, AI and process automation solutions.
What’s in it for you?
Occupational sick pay
Enhanced maternity and paternity pay
Contributory pension
Discounted insurance (Aviva)
Employee discount site
Discounted gyms (via our blue light card and benefits schemes)
Employee assistance programme
In-house mental health support
Free onsite parking
Health and wellbeing initiatives
Social events throughout the year
Cycle to work scheme
Green car scheme (subject to minimum earnings)
Registration fees paid (GPhC, NMC, CIPD etc)
Long service bonus
Refer a friend bonus
Blue light card
Hybrid working
Commitment to CPD/training
25 days annual leave increasing with service
Annual leave buy and sell scheme
Discounts & Exclusive offers at The Springs, Leeds
25% Discount on health & beauty purchases
25% Discount on Pharmacy2U Private Online Doctor Services
What will you be doing?
Manage the overall technical relationships for DA, AI and automation-related projects, making recommendations on security, cost, performance, reliability and operational efficiency to accelerate projects
Bring creativity that links technology to tangible solutions, with the opportunity to define or invent cloud-native DA, AI and service-oriented reference architectures for a variety of use cases
Participate in the creation and sharing of best practices, technical content and new reference architectures
Oversee the technologies and options for deploying, inferencing, and monitoring DA & AI models online at scale, along with the associated patterns and standards.
Diving deep hands-on, on an as-needed basis, is a key responsibility of the role
Work with data scientists and engineers to develop architectures which improve the speed of iteration and experimentation for DA features in production, while also improving overall AI and data product awareness and appreciation
Assist in ensuring the smooth delivery of products, solutions and initiatives that empower the organisation to obtain greater value from its investments
Who are we looking for?
Experience in various areas of data, AI and automation strategy and solutions for enterprises
Experience in Data Platform and Enterprise Data Warehouse technologies
A strong understanding of Data lakes and data warehouse architectures, designs, implementations, and hybrid data architectures on GCP, AWS or Azure
Solid understanding of data virtualisation, data catalogues, metadata management, data ingestion, data visualisation, data governance, security and data quality management frameworks, tools, and evolving technology landscape
Experience in working with big data technologies like Spark, HDFS, Pig, Hive, Storm, HBase, Teradata, Tableau, Google BigQuery, Dataproc, Looker, AI Platform, AutoML
Extensive hands-on experience with Google’s services such as Dataflow, Cloud Composer, Cloud Data Fusion, Cloud Pub/Sub, Data Catalog, Data Studio, Document AI or similar services from AWS or Azure
Demonstrated experience with the following technologies: Azure Data Factory, Azure Databricks, Azure Data Lake Storage, Azure Synapse Analytics, Power BI, AWS Glue, AWS Lake Formation, Amazon S3, Amazon SQS, Redshift and QuickSight
Experience with building data pipelines in streaming and batch mode
Experience building software in one or more languages such as Java, Python and SQL
Deep understanding of, and working experience with, advanced analytics, AI algorithms and machine learning techniques
Experience in development using message queues, stream processing, highly available and fault-tolerant techniques
Knowledge of container solutions like Docker and Kubernetes for managing the configuration of your deployed workloads is encouraged
Knowledge of the current state of infrastructure automation, continuous integration/deployment, SQL/NoSQL, security, networking, and cloud-based delivery models
What happens next?
Please click apply and if we think you are a good match, we will be in touch to arrange an interview.
Applicants must prove they have the right to live in the UK.
All successful applicants will be required to undergo a DBS check.
Unsolicited agency applications will be treated as a gift.
#INDTECH
Desirable skills
- Google Cloud Platform
- Azure
- Spark
- HDFS
- Pig
- Hive
- HBase
- Teradata
- Tableau
- Google BigQuery
- Looker
- Azure Data Factory
- Power BI
- Amazon S3
- Redshift
- Java
- Python
- SQL
- Docker
- Kubernetes
Professional experience
- Data Engineer
- Machine Learning
- Software Architect
Language skills
- English