Senior DevOps Engineer
AstraZeneca · Barcelona, ES
6 March
Docker Cloud Computing Kubernetes TypeScript SaaS AWS Bash DevOps Kafka Machine Learning
Role based in Barcelona: 3 days in the office / 2 days at home
We are seeking a passionate and experienced Senior DevOps Engineer to lead the transformation of our SaaS platform infrastructure and operations. Join us in leveraging cutting-edge technology, data, and AI to revolutionize life sciences and improve billions of lives globally. In this pivotal role, you will design, implement, and optimize robust cloud-based infrastructure and operational frameworks that enable rapid innovation and deliver exceptional system reliability. You will also guide and mentor team members, sharing your expertise in AWS CDK automation, Kubernetes, networking, and DevOps best practices.
Key Responsibilities
- Infrastructure Design & Management: Architect and manage scalable, multi-tenant AWS-based infrastructure using AWS CDK, ensuring modular and maintainable codebases.
- Kubernetes & EKS: Lead the deployment and management of Kubernetes clusters using Amazon EKS, implementing best practices for scalability and security.
- CI/CD Pipelines: Build, manage, and enhance automated CI/CD pipelines to ensure efficient, reliable deployments using tools like ArgoCD and GitHub Actions.
- IAM Role Management: Design, maintain, and optimize IAM roles, policies, and guardrails to ensure least privilege access across AWS resources.
- Networking: Architect and maintain AWS networking components such as VPCs, Transit Gateway, ALB, and Security Groups, ensuring robust security and performance.
- Security & Compliance: Implement DevSecOps best practices, including IAM security, encryption standards, and compliance with industry regulations (GXP, GDPR, HIPAA, NIST).
- AWS WAF & Firewall Policies: Design and implement firewall policies and AWS WAF configurations to protect applications from web threats.
- Automation: Lead efforts to automate infrastructure provisioning, application releases, and ETL workflows, reducing manual intervention and improving efficiency.
- Monitoring & Incident Response: Develop and implement comprehensive monitoring, logging, and alerting systems using OpenTelemetry, Prometheus, Grafana, AWS CloudWatch, and AWS CloudTrail.
- AWS EventBridge & CloudTrail: Utilize AWS EventBridge for event-driven automation and troubleshoot security and operational issues using AWS CloudTrail.
- Governance & Strategic Input: Drive governance processes, including security reviews, cost optimization, and operational consistency across the platform.
- AWS Control Tower & Multi-Account Management: Manage multiple AWS accounts using AWS Control Tower and best practices for account isolation.
- AI & Machine Learning: Exposure to AI tools and frameworks is a plus.
- Mentorship & Leadership: Mentor and guide junior and mid-level engineers, fostering a culture of learning and collaboration. Provide technical leadership in the adoption of AWS CDK and best practices for cloud automation.
- Collaboration: Partner with cross-functional teams, including product management and security, to align DevOps strategies with business goals and ensure cohesive development and operational workflows.
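The IAM bullet above calls for least-privilege roles and policies. As a rough illustration only (this is not AstraZeneca's actual setup; the bucket name and helper are hypothetical), a minimal policy document scoped to a single S3 bucket can be generated like this:

```python
import json

def least_privilege_policy(bucket_name: str, role_actions: list[str]) -> dict:
    """Build a minimal IAM policy document scoped to one S3 bucket.

    Only the explicitly requested actions are granted, and only on the
    named bucket and its objects; everything else is implicitly denied.
    """
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": sorted(role_actions),
                "Resource": [
                    f"arn:aws:s3:::{bucket_name}",
                    f"arn:aws:s3:::{bucket_name}/*",
                ],
            }
        ],
    }

# Example: read-only access for a hypothetical tenant bucket.
policy = least_privilege_policy("tenant-a-data", ["s3:GetObject", "s3:ListBucket"])
print(json.dumps(policy, indent=2))
```

Generating policies from code like this (in practice via AWS CDK constructs rather than raw dicts) keeps the guardrails reviewable and versioned alongside the infrastructure.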
Required Experience & Qualifications
- Experience: 7+ years in DevOps or cloud infrastructure roles, with significant experience in SaaS and multi-tenant platforms. Proven track record of mentoring team members.
- Cloud Expertise: Expert knowledge of AWS services, including VPC, IAM, EC2, S3, RDS, Lambda, EKS, AWS WAF, AWS EventBridge, and AWS CloudTrail.
- Containerization & Orchestration: Deep proficiency in Docker, Kubernetes, Helm, and associated ecosystem tools.
- CI/CD Proficiency: Expertise in CI/CD tools such as ArgoCD and GitHub Actions.
- Infrastructure as Code (IaC): Advanced experience with AWS CDK (TypeScript preferred) and CloudFormation.
- Networking: Strong understanding of AWS networking services such as VPCs, Transit Gateway, ALB, and Security Groups.
- Security: In-depth knowledge of IAM, AWS KMS, encryption standards, AWS WAF, and security compliance frameworks including NIST.
- Monitoring & Alerting: Extensive experience with OpenTelemetry, Prometheus, Grafana, AWS CloudWatch, and AWS CloudTrail for monitoring and incident response.
- Data & ETL Pipelines: Familiarity with AWS Glue and Managed Kafka for real-time and batch data processing.
- Programming & Automation: Strong scripting and automation skills using TypeScript and Bash.
- Multi-Account AWS Management: Experience managing multiple AWS accounts with AWS Control Tower.
- Communication & Collaboration: Exceptional verbal and written communication skills, with the ability to explain complex technical concepts to diverse stakeholders.
Desired Experience & Qualifications
- Advanced expertise in AWS CDK, including building complex, reusable constructs and pipelines.
- Familiarity with Projen for automating CDK project configuration and management.
- Hands-on experience with Helm charts and Kubernetes manifests.
- Experience with monitoring and logging tools such as Prometheus, Grafana, and AWS CloudWatch. Exposure to multi-tenant SaaS platforms and best practices.
- Experience working with AI tools and frameworks.
Personal Attributes
- Mentor & Leader: Enjoys mentoring team members and fostering a collaborative, innovation-driven team culture.
- Organized & Adaptable: Able to manage multiple priorities and thrive in a fast-paced environment.
- Innovative: Passionate about leveraging technology to solve complex problems and drive efficiency.
- Customer-Focused: Dedicated to building infrastructure that delivers measurable business and customer value.
Work Arrangement:
This role is based in Barcelona, Spain, and requires working a minimum of three days per week on-site.
Join Evinova and redefine healthcare with us. Apply now to be part of a team that's transforming life sciences with technology, data, and innovation.
Date Posted
02-Mar-2026
Closing Date
30-Mar-2026
AstraZeneca embraces diversity and equality of opportunity. We are committed to building an inclusive and diverse team representing all backgrounds, with as wide a range of perspectives as possible, and harnessing industry-leading skills. We believe that the more inclusive we are, the better our work will be. We welcome and consider applications to join our team from all qualified candidates, regardless of their characteristics. We comply with all applicable laws and regulations on non-discrimination in employment (and recruitment), as well as work authorization and employment eligibility verification requirements.
Data Engineer
B. Braun Group · Barcelona, ES
3 March
Python Agile TSQL Azure Cloud Computing DevOps Terraform Power BI
We are seeking a Data Engineer to join our team, focusing on building scalable and governed data products in a cloud data mesh architecture for the SAP Finance & Controlling domain.
This specialized role is paramount for designing, maintaining, and optimizing robust data pipelines and semantic models on our Azure-based Data Analytics Platform, leveraging Databricks and Microsoft Fabric. The ideal candidate combines strong technical proficiency in modern data engineering with the ability to translate finance and controlling business logic into governed, performant data models.
Experience with SAP FI/CO processes is preferred, as well as advanced skills in data modeling, Data Contracts, and cost/performance optimization. You will be instrumental in ensuring high data quality, governance, and availability for critical business intelligence and analytical dashboards. We are looking for a proactive, solution-oriented individual eager to contribute to a multidisciplinary, agile, and international environment.
Your Tasks in the Team
- Design, build, and operate data pipelines on Azure Data Factory and Databricks (PySpark/SQL, Delta Lake) using Azure DevOps for CI/CD.
- Apply advanced data modeling techniques (dimensional/star, data vault, normalized models) and implement Medallion architecture (Bronze/Silver/Gold).
- Define and enforce Data Contracts: schemas, SLAs/SLOs, versioning, and validation gates.
- Optimize Databricks workloads for performance and cost (partitioning, Z-ORDER, caching, Photon, autoscaling, cluster policies).
- Standardize delivery with Databricks Asset Bundles and implement observability (job metrics, audit logs).
- Ensure compliance with governance, security, and regulatory requirements via Unity Catalog and RBAC/ABAC policies.
- Embed data quality frameworks, automated tests, and monitoring for pipeline health, SLA breaches, and anomaly detection.
- Collaborate closely with Finance stakeholders and domain engineers to ensure KPI sign-off and business alignment.
- Contribute to technical documentation, participate in code reviews, and drive continuous improvement.
- (Preferred) Build semantic models in Microsoft Fabric/Power BI aligned with curated data and governed KPIs.
- (Preferred) Translate SAP FI/CO business logic (GL, AP/AR, allocations, exchange rates) into reconciled semantic models.
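The tasks above include defining Data Contracts with schemas and validation gates. As a minimal sketch (the schema, field names, and promotion terminology are hypothetical, not B. Braun's actual contract format), a batch-level gate that blocks promotion from Bronze to Silver on schema violations could look like this:

```python
# Hypothetical data contract: required fields and their expected types
# for a finance record, checked before Bronze -> Silver promotion.
CONTRACT = {
    "invoice_id": str,
    "company_code": str,
    "amount": float,
}

def validate_batch(records):
    """Split a batch into (valid_rows, violations).

    Any violation -- a missing contract field or a type mismatch --
    is reported with its row index so the promotion gate can fail
    loudly instead of silently passing bad data downstream.
    """
    valid, violations = [], []
    for i, row in enumerate(records):
        missing = [k for k in CONTRACT if k not in row]
        bad_type = [k for k, t in CONTRACT.items()
                    if k in row and not isinstance(row[k], t)]
        if missing or bad_type:
            violations.append({"row": i, "missing": missing, "bad_type": bad_type})
        else:
            valid.append(row)
    return valid, violations

# Example batch: one clean row, one row with a missing field and a bad type.
good = {"invoice_id": "INV-1", "company_code": "0100", "amount": 99.5}
bad = {"invoice_id": "INV-2", "amount": "12.0"}
valid, violations = validate_batch([good, bad])
```

In a real pipeline this check would typically be expressed as Delta Live Tables expectations or Great Expectations suites rather than hand-rolled Python, but the gate semantics are the same.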
Your Profile
- Strong experience with Microsoft Azure (ADLS Gen2, Data Factory, Key Vault) and foundational networking/security.
- Hands-on expertise in Databricks: PySpark, SQL, Delta Lake, Unity Catalog, Asset Bundles; performance tuning and cost optimization.
- Advanced data modeling skills: dimensional/star, data vault, semantic layers; optimization for query performance.
- Proficiency in Python and SQL for data processing; modular code and unit testing.
- Experience with Azure DevOps (Repos, Pipelines, approvals) and CI/CD strategies with rollback procedures.
- Knowledge of Data Contracts: schema definition, SLAs/SLOs, versioning, compatibility policies.
- Familiarity with event-driven architectures and real-time data streaming.
- Experience working in Agile/Scrum environments.
- Fluent in English (written and spoken).
Nice to Have
- SAP FI/CO domain knowledge (GL, AP/AR, Asset Accounting, Cost Center Accounting, Internal Orders, CO-PA).
- Microsoft Fabric / Power BI: semantic modeling, dataset governance, KPI standardization.
- Infrastructure as Code (Terraform for Azure & Databricks).
- Data Quality & Anomaly Detection frameworks (DLT expectations, Great Expectations).
- Cost governance: tagging, dashboards, budgets/alerts.
- Advanced modeling patterns: slowly changing dimensions, snapshotting, late-arriving facts.
- Security & Compliance: data masking, tokenization, PII minimization.
AWS Cloud Engineer with English
Aubay · Barcelona, ES
3 March
Python Cloud Computing AWS Terraform
Responsibilities
- Design and architect secure, scalable AWS infrastructures.
- Implement Infrastructure as Code (Terraform) and automate deployments with GitLab-CI.
- Develop and maintain high-quality Python code for cloud solutions.
- Manage AWS services such as Lambda, Fargate, S3, and IAM, among others.
- Implement and maintain serverless and containerized architectures.
- Configure and administer SIEM solutions (Splunk) and AWS security tooling.
- Apply DevSecOps practices and security across the entire development lifecycle.
- Set up monitoring, logging, and alerting with CloudWatch, Prometheus, Grafana, and PagerDuty.
Hybrid model: 3-4 days remote + 1-2 days on-site at our offices (next to the Bogatell metro stop, Barcelona)
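The responsibilities above mention developing Python for AWS serverless services such as Lambda. As a minimal sketch of what that looks like (a generic handler following the API Gateway proxy-integration shape, not Aubay's actual code), a Lambda entry point is just a function taking an event and a context:

```python
import json

def handler(event, context):
    """Minimal AWS Lambda handler behind an API Gateway proxy integration.

    `event["body"]` arrives as a JSON string; the return value must follow
    the proxy response contract: statusCode, headers, and a string body.
    """
    body = json.loads(event.get("body") or "{}")
    name = body.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }

# Local smoke test with a fake API Gateway event (no AWS needed).
response = handler({"body": json.dumps({"name": "Barcelona"})}, None)
```

Keeping the handler free of AWS SDK calls where possible makes it trivially unit-testable, which fits the shift-left approach the posting asks for.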
Requirements
- Experience in engineering and architecting AWS infrastructure solutions.
- Experience with AWS Landing Zone, AWS networking and security services, and a multi-account strategy on AWS.
- Knowledge of Infrastructure as Code principles and design with Terraform.
- Experience with GitLab and GitLab-CI.
- Proven experience writing Python code.
- Deep understanding of AWS infrastructure and services (Fargate, Lambda, S3, WAF, KMS, Transit Gateway, IAM, AWS Config).
- Experience with SIEM solutions, ideally Splunk.
- Experience with the following concepts: shift-left and DevSecOps approaches, SBOM, SAST, and AWS security and compliance services (AWS Config, Inspector, Network Firewall, etc.).
- Experience with logging, monitoring, and alerting best practices on AWS Cloud and standard tooling (Splunk, CloudWatch Logs, Prometheus, Grafana, Alertmanager, and PagerDuty).
- English.
A technical test is required before the interview.
What We Offer
At Aubay we are recruiting an AWS Cloud Engineer with English in Barcelona.
We offer the chance to join a company in continuous growth, taking part in innovative projects that will round out your training and develop your skills. We value commitment and dedication to the work you do.
Aubay is a multinational digital services company (DSC) founded in 1998 and currently growing strongly. We operate in high-value-added markets, both in France and elsewhere in Europe. Aubay currently employs 5,000 people.
From consultancy to technology projects of every kind, we support the transformation and modernization of information systems across all sectors, including industry, R&D, telecommunications, and infrastructure, and especially the major banks and insurance companies, which account for more than 80% of our French revenue and 65% of our European revenue.
Join us; we look forward to meeting you!
#LI-LR1
Data Engineer
IO Interactive · Barcelona, ES
1 March
.Net Python TSQL Azure Cloud Computing AWS Power BI
Welcome to IO Interactive, where we shape worlds, stories, and adventures for players around the globe. Now, we’re looking for our next adventurer: a Data Engineer to join our centralized Business Intelligence team.
You’ll be joining a team that genuinely enjoys working together. We are a group known for curiosity, collaboration, and a great sense of humor. Most of the team is based in Copenhagen, but we work seamlessly across all our studios. You’ll be stepping into an environment where people support each other, share knowledge openly, and have fun while tackling complex data challenges.
This is not just a support role. As a Data Engineer at IO Interactive, you will help shape how we understand our players, our games, and our business. You will turn raw information into meaningful, actionable intelligence that empowers our teams to make smarter decisions, from post‑launch game performance to commercial insights to financial forecasting across all of IO Interactive.
This is a role for someone curious, grounded, collaborative, and capable of navigating ambiguity. You enjoy understanding the problem before rushing to the solution, and you can translate complex technical topics into clear, accessible insights for non‑technical stakeholders.
If you want your work to directly influence iconic, industry‑defining games, while being part of a genuinely warm, international, and fun team, then you are our new companion, and this is the adventure for you.
This position is open in our Malmö, Copenhagen, Brighton, and Barcelona studios. We offer a welcoming and great studio culture with a hybrid setup of 4 days in the studio and 1 optional remote day.
What You Will Do
- Finance: financial dashboards, cashflow monitoring, budget allocation dashboards, assignment plan insights, salary review dashboards etc.
- Games: post‑release game analytics for Hitman and 007 First Light, game performance dashboards, player behavior insights, hardware usage analytics etc.
- Commercial: commercial insights and reporting across the IO Interactive portfolio.
- Identify, collaborate, and enable data collection requirements across finance, commercial, and game analytics domains.
- Build, maintain, and optimize robust data processing pipelines with consideration for data protection laws, cost, and scalability.
- Implement dashboards and ensure stakeholders are onboarded and empowered to use them effectively.
- Monitor pipeline health, troubleshoot issues, and ensure reliable data availability.
- Contribute to BI team initiatives that enhance tooling, processes, prioritization, and data culture across IOI.
What We Are Looking For
- Several years of experience in Data Engineering, Data Analytics, or Business Intelligence, ideally within the financial or tech industry.
- Experience working with financial and commercial analytics, such as P&L reports, cash flow analysis, revenue monitoring, and multinational financial structures.
- Comfortable working with data warehouses, databases, ETL/ELT pipelines, and dashboarding tools.
- Experience automating data extraction from RESTful APIs. Knowledge of Python or .NET is helpful.
- Practical experience with SQL and data processing techniques.
- A degree in a relevant field such as data science, computer science, information systems, software engineering, statistics, applied mathematics, finance, or a related discipline.
- Experience with Microsoft data stack is a plus: Azure Synapse, Data Factory, Azure SQL, Fabric, Power BI, Blob Storage.
- Experience in Google Cloud, AWS, or equivalent ecosystems is also valued.
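One of the qualifications above is automating data extraction from RESTful APIs in Python. As a minimal sketch (the offset/limit pagination scheme and the injected fetcher are illustrative assumptions, not any specific API IO Interactive uses), the core loop is independent of any particular HTTP client:

```python
def fetch_all_pages(fetch_page, page_size=100):
    """Collect every record from a paginated REST-style endpoint.

    `fetch_page(offset, limit)` is any callable returning a list of
    records; a short (or empty) page signals the last page. Injecting
    the fetcher keeps the pagination logic testable without a live API.
    """
    records, offset = [], 0
    while True:
        page = fetch_page(offset, page_size)
        records.extend(page)
        if len(page) < page_size:
            return records
        offset += page_size

# Example: a fake in-memory "API" of 250 records, no network required.
data = list(range(250))

def fake_fetch(offset, limit):
    return data[offset:offset + limit]

records = fetch_all_pages(fake_fetch, page_size=100)
```

In production the fetcher would wrap an HTTP call (with auth, retries, and rate limiting), but separating it from the pagination loop is what makes the extraction automatable and easy to unit test.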
IO Interactive is an independent videogame development and publishing company with studios in Copenhagen, Malmö, Barcelona, Istanbul, and Brighton. As the creative force behind some of the most talked-about multiplatform video games in the last decade, we are committed to creating unforgettable characters and experiences – all powered by our award-winning, proprietary Glacier technology.
IOI is a studio that values in-person collaboration. Being together helps us focus our collective energy on our immediate goals. For us, being both in-office and connected across our studios helps us integrate our teams faster, strengthen relationships, and improve knowledge-sharing. We believe that the more time we spend together, the more quality and progress we achieve for our games and players.
We know that to achieve those goals, we need courage, talented people, and a great working environment – and we do our utmost to have all of that. Across our multiple studios, we’re working on several projects. Crucially, though, we’re all one team. We value the work and impact that each person brings to the table, and we actively encourage new ideas, whilst listening to your insights along the way.
We have a dedicated team of People Managers, who look after you as an individual and as an employee. With more than 40 nationalities, we know that everyone is different, and we are proud of our reputation as a friendly workplace with highly talented people.