Data Engineer (12 Month FTC)


Responsibilities

- Design and develop data pipelines that extract data from various sources, transform it into the desired format, and load it into the appropriate data storage systems (an illustrative sketch of this kind of pipeline appears at the end of this posting)
- Implement data quality checks and validations within data pipelines to ensure the accuracy, consistency, and completeness of data
- Optimise data pipelines and data processing workflows for performance, scalability, and efficiency
- Take authority, responsibility, and accountability for exploring the value of information available and of the analytics used to provide insights for decision making
- Work across the business to establish the vision for managing data as a business asset
- Monitor and tune data systems, identify and resolve performance bottlenecks, and implement caching and indexing strategies to enhance query performance
- Establish the governance of data and algorithms used for analysis and analytical decision making
- Collaborate with Subject Matter Experts to optimise models and algorithms for data quality, security, and governance

Qualifications

- Ideally degree educated in computer science, data analysis or similar
- Strategic and operational decision-making skills
- Ability and attitude towards investigating and sharing new technologies
- Ability to work within a team and share knowledge
- Ability to collaborate within and across teams of different technical knowledge to support delivery to end users
- Problem-solving skills, including debugging, the ability to recognise and solve repetitive problems, and root cause analysis
- Ability to describe business use cases, data sources, management concepts, and analytical approaches
- Experience in data management disciplines, including data integration, modelling, optimisation, data quality and Master Data Management
- Excellent business acumen and interpersonal skills; able to work across business lines at all levels to influence and effect change to achieve common goals
- Proficiency in the design and implementation of modern data architectures (ideally Azure / Microsoft Fabric / Data Factory) and modern data warehouse technologies (Databricks, Snowflake)
- Experience with database technologies such as RDBMS (SQL Server, Oracle) or NoSQL (MongoDB)
- Knowledge of Apache technologies such as Spark, Kafka and Airflow to build scalable and efficient data pipelines
- Ability to design, build, and deploy data solutions that explore, capture, transform, and utilise data to support AI, ML, and BI
- Proficiency in data science languages / tools such as R, Python, SAS
- Awareness of ITIL (Incident, Change, Problem management)

Diversity, Equity and Inclusion

To attract the best people, we strive to create a diverse and inclusive environment where everyone can bring their whole selves to work, have a sense of belonging, and realise their full career potential. Our new enabled work model allows our people to have more flexibility in the way they choose to work from both the office and a remote location, while continuing to deliver the highest standards of service. We offer a range of family-friendly and inclusive employment policies and provide access to programmes and services aimed at nurturing our people's health and overall wellbeing. Find out more about Diversity, Equity and Inclusion here.

Company

We're Norton Rose Fulbright - a global law firm with over 50 offices and 7,000 employees worldwide. We provide the world's preeminent corporations and financial institutions with a full business law service. At Norton Rose Fulbright, our strategy and our culture are closely entwined.
We know that our expansion will mean little unless it is underpinned by truly global collaboration, and we understand that pioneering work only takes place when our people have room to move and think beyond boundaries. As well as the relevant skills and experience, we're looking for people who are innovative, commercial and value the work that they do.
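As a purely illustrative aside (not part of the role description): a minimal sketch of the kind of extract-validate-transform-load pipeline described under Responsibilities, assuming Python with pandas and SQLAlchemy; the file, column, table and connection names are hypothetical placeholders.

```python
# Illustrative sketch only: extract -> validate -> transform -> load,
# with basic data quality checks. All names below are hypothetical.
import pandas as pd
from sqlalchemy import create_engine


def extract(csv_path: str) -> pd.DataFrame:
    """Extract raw records from a source file."""
    return pd.read_csv(csv_path)


def validate(df: pd.DataFrame) -> pd.DataFrame:
    """Data quality checks: required columns present, no null keys, no duplicate keys."""
    required = {"record_id", "client", "opened_on"}
    missing = required - set(df.columns)
    if missing:
        raise ValueError(f"Missing required columns: {missing}")
    if df["record_id"].isna().any():
        raise ValueError("Null record_id values found")
    return df.drop_duplicates(subset="record_id")


def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Normalise types and values for downstream reporting."""
    df = df.copy()
    df["opened_on"] = pd.to_datetime(df["opened_on"], errors="coerce")
    df["client"] = df["client"].str.strip().str.upper()
    return df


def load(df: pd.DataFrame, connection_string: str) -> None:
    """Load the cleaned data into a warehouse staging table."""
    engine = create_engine(connection_string)
    df.to_sql("stg_records", engine, if_exists="replace", index=False)


if __name__ == "__main__":
    frame = transform(validate(extract("source_records.csv")))
    load(frame, "sqlite:///warehouse.db")
```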
Location:
North East, England, United Kingdom
Job Type:
Full Time
