Senior Data Engineer
Emburse · Barcelona, ES
21 Oct.
Python TSQL SaaS AWS DevOps LESS
Emburse data engineers develop the data pipelines and systems in the central platform powering Emburse's SaaS products. As a data engineer, you will build the pipelines that populate the data warehouse and data lakes, implement tenant data security, support the data science platforms and techniques, and integrate data solutions and APIs with Emburse products and analytics. The role sits within the Emburse Platform analytics team, a fast-moving, product-focused team responsible for delivering next-generation business intelligence and data science capabilities across the business. Emburse, known for its innovation and award-winning technologies, employs modern technologies including Snowflake, Databricks/Spark, AWS, and Looker. In this role, you will have access to the best and brightest minds in our industry to grow your experience and career within Emburse.
What You'll Do
- Develops code (e.g. Python), infrastructure, and tools for the extraction, transformation, and loading of data from a wide variety of data sources, using SQL, streaming, and related data lake and data warehouse technologies
- Builds analytical tools to utilize, model and visualize data
- Assembles large, complex data sets to meet business needs, both for ad-hoc requests and as part of ongoing software engineering projects
- Develops scripts to automate manual processes, address data quality, enable integration or monitor processes
- Maintains a deep understanding of data security as it applies to multi-tenant environments, multiple regions, and financial-industry data
- Takes personal responsibility for quality and maintainability of the product and actively identifies areas for improvement
- Identifies problems and risks in their own work and that of others
- Identifies viable alternative solutions and presents them
- Follows SDLC processes, including adopting agile-based processes, peer code-reviews, and technical preparations required for scheduled releases and demos
- Partners with product, analytics, and data science teams to drive requirements that take all parties' needs into account
- Establishes monitoring, responds to alerts and resolves issues within data pipeline
- Develops sophisticated data-oriented software or systems with minimal supervision
- Onboards and mentors less experienced team members
- Makes complex contributions to technical documentation, knowledge bases, data dictionaries, and team/engineering presentations
- Optimizes processes, fixes complex bugs and demonstrates advanced debugging skills
- Collaborates with product owners, software developers, data scientists, devops and analysts
- Takes on expanded code review responsibilities
- Performs advanced refactoring
What You'll Need
- Bachelor's degree in Computer Science or a related field, or equivalent experience
- Advanced working SQL knowledge and experience working with a variety of relational databases
- Experience working with a modern scalable data lake or data warehouses
- Experience working with a modern data pipeline or data workflow management tool
- Experience working in a product-oriented environment alongside software engineers and product managers
- Experience with Python in a full SDLC/production deployment environment
- Preferred: Experience with AWS services
- Experience working with Snowflake
- Experience working with Looker or an equivalent Business Intelligence suite
- Experience working with Fivetran or an equivalent ETL/ELT suite
- Experience with Databricks or an equivalent Spark-based suite (financial industry experience preferred)