Data Engineer
20 Mar · Krell Consulting & Training · Madrid, ES
TSQL Jenkins Kubernetes Scala Hadoop Spark
At Krell Consulting we are looking for a Data Engineer! 🚀
At Krell Consulting, we are looking for a Data Engineer with experience in Scala, Spark, Hadoop, and Kubernetes to join a team that is key to data management and optimization. If you are after a challenge in a dynamic, high-level environment, this is your opportunity!
What you'll do:
🔹 Ensure the performance and integrity of the data infrastructure.
🔹 Monitor and troubleshoot data flows.
🔹 Build and optimize scalable data pipelines with Scala, Spark, and Hadoop.
🔹 Implement CI/CD and automate processes with Airflow.
🔹 Migrate the existing Hadoop infrastructure to a cloud environment using Kubernetes.
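The pipeline bullets above describe a Spark/Scala stack; as a rough, framework-free sketch of the same parse → filter → aggregate shape, here is a plain-Python illustration (the field names and data are invented for the example, not taken from the posting):

```python
# Minimal sketch of one batch-pipeline step: parse -> filter -> aggregate.
# In Spark the same shape would be map/filter/reduceByKey over an RDD, or
# select/where/groupBy over a DataFrame; plain Python keeps the pattern
# visible. The "country"/"amount" fields are hypothetical.
from collections import defaultdict

def parse(line: str) -> dict:
    country, amount = line.split(",")
    return {"country": country.strip(), "amount": float(amount)}

def run_pipeline(lines):
    records = (parse(l) for l in lines)                # map
    valid = (r for r in records if r["amount"] >= 0)   # filter bad rows
    totals = defaultdict(float)
    for r in valid:                                    # aggregate by key
        totals[r["country"]] += r["amount"]
    return dict(totals)

if __name__ == "__main__":
    data = ["ES, 10.0", "ES, 5.5", "FR, 2.0", "FR, -1.0"]
    print(run_pipeline(data))  # {'ES': 15.5, 'FR': 2.0}
```

The generator expressions keep the step streaming-friendly, which is the property that makes the same shape scale out when moved to Spark.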
Requirements:
✅ Solid experience with Scala, Spark, and SQL.
✅ Working knowledge of CI/CD tools (GitLab, Jenkins).
✅ Familiarity with Hadoop, Airflow, and Kubernetes.
✅ Experience in data modelling, transformation, and process automation.
✅ Languages: English C1 (required).
What we offer:
✅ Permanent contract and job stability.
📚 Learning and professional development opportunities.
💼 A stable, long-term position.
🤝 A dynamic, collaborative work environment.
🎓 Funding for courses and certifications to boost your career.
🔄 Possibility of moving to other projects.
Data Analyst Retail Media
20 Mar · Criteo · Barcelona, ES
Python TSQL Hadoop
What You'll Do:
The Retail Media EMEA Analytics team, part of the rapidly expanding Criteo Retail Media business, leverages our vast set of shopper data to design and develop actionable insights for our Retailer and Brand clients. We bring an analytical perspective to the business and provide recommendations based on high-quality data analysis and a strong understanding of our industry and products. We leverage our unique position to provide our commercial teams and clients with visibility into shopper behavior, market trends, and product performance that can't be found anywhere else.
Criteo's high-growth business model brings both opportunities and challenges. This position requires working with large data sets and a variety of stakeholders to solve complex business problems by thinking strategically and proposing innovative solutions. The ideal candidate works with minimal oversight and learns new concepts quickly.
- Mine massive data sets and turn them into understandable and actionable insights
- Build scalable analytic solutions using state-of-the-art tools from the Criteo analytics stack (Hadoop, Vertica, PySpark...)
- Develop innovative analytical approaches to measure Brands' & Retailers' performance, from design to presentation of results, leveraging all our expertise in Data Science (statistical thinking, A/B testing, bootstrapping, data visualization...)
- Frequently interact with other teams (R&D, product, commercial teams...) and clients to share data-driven recommendations and insights
- Develop and maintain deep knowledge of Criteo Retail Media products and technologies as they evolve
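The bootstrapping named in the bullets above can be sketched with the standard library alone; the data and the 95% level here are invented for illustration and are not Criteo's actual methodology:

```python
# Percentile-bootstrap confidence interval for a mean, stdlib only.
# Illustrates the generic "bootstrapping" technique the posting mentions;
# the sample values are made up.
import random
from statistics import mean

def bootstrap_ci(sample, n_resamples=2000, alpha=0.05, seed=42):
    rng = random.Random(seed)
    means = sorted(
        mean(rng.choices(sample, k=len(sample)))  # resample with replacement
        for _ in range(n_resamples)
    )
    lo = means[int(alpha / 2 * n_resamples)]
    hi = means[int((1 - alpha / 2) * n_resamples) - 1]
    return lo, hi

if __name__ == "__main__":
    clicks_per_user = [0, 1, 0, 2, 1, 3, 0, 1, 1, 2]
    lo, hi = bootstrap_ci(clicks_per_user)
    print(f"95% CI for the mean: [{lo:.2f}, {hi:.2f}]")
```

The same resample-and-recompute loop works for any statistic (uplift, conversion rate, median), which is why it is a staple of measurement work like this.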
Who You Are:
You are passionate about data and want to work with leading-edge technologies in the online industry:
- You have at least 2 years' experience in Data Science or Data Analytics
- Master's degree or higher in a quantitative field (Engineering, Mathematics, Computer Science, Physics, etc.)
- Fluency in the core toolkit of Data Science/Data Analytics: SQL is required, Python is a real asset (mastery of the most common data libraries: pandas, numpy...)
- Outstanding analytical skills: passion for translating data-speak into relevant, compelling stories
- A background in the digital industry or consulting is a plus
- You have the combination of technical skills, passion for learning, and the soft skills to work with all personality types in a dynamic environment
- You are fluent in English
We acknowledge that many candidates may not meet every single role requirement listed above. If your experience looks a little different from our requirements but you believe that you can still bring value to the role, we'd love to see your application!
Who We Are:
Criteo is the global commerce media company that enables marketers and media owners to deliver richer consumer experiences and drive better commerce outcomes through its industry-leading Commerce Media Platform.
At Criteo, our culture is as unique as it is diverse. From our offices around the world or from home, our incredible team of 3,600 Criteos collaborates to develop an open and inclusive environment. We seek to ensure that all of our workers are treated equally, and we do not tolerate discrimination based on race, gender identity, gender, sexual orientation, color, national origin, religion, age, disability, political opinion, pregnancy, migrant status, ethnicity, marital or family status, or other protected characteristics at all stages of the employment lifecycle including how we attract and recruit, through promotions, pay decisions, benefits, career progression and development. We aim to ensure employment decisions and actions are based solely on business-related considerations and not on protected characteristics. As outlined in our Code of Business Conduct and Ethics, we strictly forbid any kind of discrimination, harassment, mistreatment or bullying towards colleagues, clients, suppliers, stakeholders, shareholders, or any visitors of Criteo. All of this supports us in our mission to power the world's marketers with trusted and impactful advertising encouraging discovery, innovation and choice in an open internet.
Why Join Us:
At Criteo, we take pride in being a caring culture and are committed to providing our employees with valuable benefits that support their physical, emotional and financial wellbeing, their interests, and important life events. We aim to create a place where people can grow and learn from each other while having a meaningful impact. We want to set you up for success in your job, and an important part of that includes comprehensive perks & benefits. Benefits may vary depending on the country where you work and the nature of your employment with Criteo. When determining compensation, we carefully consider a wide range of job-related factors, including experience, knowledge, skills, education, and location. These factors can cause your compensation to vary.
Data Engineer (Spark Scala SQL)
19 Mar · CAS TRAINING · Madrid, ES
Remote Work TSQL Jenkins Kubernetes Scala Hadoop Spark
We are looking for a Data Engineer to work in a hybrid model in Madrid!
Responsibilities:
• Ensure the performance and integrity of the data infrastructure.
• Monitor and troubleshoot data flows, ensuring consistency and accuracy.
• Build and optimize scalable data pipelines using Scala, Spark, and Hadoop.
• Implement CI/CD pipelines and automate data processes with Airflow.
• Migrate the existing Hadoop infrastructure to a cloud environment using Kubernetes
Requirements
• Solid experience with Scala, Spark, and SQL.
• Working knowledge of CI/CD tools (GitLab, Jenkins).
• Familiarity with Hadoop, Airflow, and Kubernetes.
• Experience in data modelling, data transformation, and process automation.
• Business knowledge of the banking industry is a plus.
Languages:
• English – C1
What's on offer:
• Join a dynamic, highly qualified team at a company in the middle of an expansion.
• Take part in innovative, cutting-edge projects for top-tier clients across different market sectors.
• Long-term projects, professional stability, and career progression.
• Permanent contract.
• Free access to the annual Cas Training course catalogue.
• Salary negotiable based on the candidate's experience and merit.
• Work model: hybrid in Madrid (2 days on-site, the rest remote)
Security Insight Data Engineer
18 Mar · Swiss Re · Madrid, ES
C# MySQL Python Agile TSQL Azure Jenkins Docker Cloud Computing Kubernetes Oracle Hadoop AWS PowerShell R DevOps Terraform Spark SQL Server
We are seeking a skilled and motivated Data Engineer with a DevOps background to join our Security Insights Team. You will be responsible for designing, building, and maintaining the infrastructure that supports data analysis and reporting activities. Additionally, you will collaborate closely with data scientists, analysts, and other team members to ensure smooth data operations, optimize data pipelines, and facilitate the delivery of useful insights.
Join our team of cybersecurity professionals and help Swiss Re to fulfil its mission in making the world more resilient.
About the team
The Security Team is the focal point for all security activities across Swiss Re. We are responsible for cybersecurity engineering and operations, governance, risk and compliance. We define, monitor, and advance the company's security strategy. We are looking for a self-starter who is happy to challenge the status quo of existing processes, aiming to improve their efficiency and develop the solutions supporting them.
As part of the Security Insights team, you will be responsible for enabling the Security team to make data-driven decisions by providing dashboards and insights.
In your role, you will...
- Build and manage robust data infrastructure, including data storage, processing, transformation, and visualization tools, using the Azure cloud-based platform and related technologies (such as Terraform)
- Implement and maintain CI/CD practices for data applications and infrastructure, automating deployment, testing, and monitoring processes
- Collaborate with data analysts, and business stakeholders to understand their requirements and provide technical support to enable seamless data access, transformation, and analysis
- Optimize data processing and ETL workflows, continuously monitoring and improving their performance, reliability, and scalability
- Ensure data security and quality standards are upheld throughout the data lifecycle, implementing appropriate data governance processes and best practices
Your qualifications
Nobody is perfect and meets 100% of our requirements. If, however, you meet some of the criteria below and are curious about the world of cyber security and data insights, we'll be more than happy to meet you!
What we need from you:
- Experience with data engineering frameworks (e.g., Synapse, Databricks, Apache Spark, Hadoop).
- Experience with automation tools and technologies such as Terraform
- Experience with relational databases (e.g., SQL Server, Oracle, MySQL)
- Proficiency in any programming language, examples include SQL, Python, PowerShell, C#, and R
- Experience with CI/CD pipelines and tools such as Azure DevOps, Jenkins, or GitLab
- Good communication skills in spoken and written English
Nice-to-have:
- Understanding of technologies related to information security, vulnerability management, and Identity & Access Management
- Relevant work experience as a DevOps Engineer or similar role
- Experience with other cloud platforms such as AWS and GCP
- Experience with containerization technologies such as Docker and Kubernetes
- Experience with Agile and DevOps methodologies
We provide feedback to all candidates. If you have not heard from us, please check your spam folder.
For Spain, the base salary range for this position is between EUR 41,000 and EUR 69,000 (for a full-time role). The specific salary offered considers:
the requirements, scope, complexity and responsibilities of the role,
the applicant's own profile including education/qualifications, expertise, specialisation, skills and experience.
In addition to your base salary, Swiss Re offers an attractive performance-based variable compensation component, designed to recognise your achievements. Further you will enjoy a variety of global and location specific benefits.
Eligibility may vary depending on the terms of Swiss Re policies and your employment contract.
Data Engineer
17 Mar · rebAI · Madrid, ES
Spark Hadoop Databases Machine Learning Data Science Big Data ETL Hive Data Warehousing Data Engineering TSQL Cloud Computing
We are looking for a Data Engineer with SQL experience and a passion for the world of data. If working with cloud infrastructure and data modelling motivates you, this is your opportunity.
Experience: 3+ years
Solid SQL experience (required).
Knowledge of Google Cloud Platform (GCP) (desirable).
Experience with BigQuery (desirable).
We are a consultancy that builds Artificial Intelligence and Data solutions. We guide companies through an orderly, supervised adoption of AI, optimizing it according to their starting point. We run projects end to end, with ongoing maintenance and support, in sectors where well-modelled data amplifies the value of AI.
Our claim is clear: "Beneficios gracias a la Inteligencia Artificial" ("Profits thanks to Artificial Intelligence").
Work model: hybrid, Madrid
Growth opportunities in a dynamic team.
Challenging projects with real impact.
A collaborative environment focused on AI and data innovation.
Senior Data Engineer
17 Mar · Nestle · Barcelona, ES
Python TSQL Azure Cloud Computing SOA Hadoop R Spark Big Data
We are currently looking for a Sr. Data Engineer to join our Analytics Service Line (ASL) department
Position Snapshot
- Location: Barcelona
- Type of Contract: Permanent
- Stream: IT Analytics, Data and Integration
- Type of work: Hybrid
- Work Language: Fluent Business English
The role
As a Sr. Data Engineer, you should have technical experience building scalable distributed software on the cloud, combining cognitive computing / advanced analytics technology with traditional data engineering/science and applying it at scale to transform enterprise business processes. A Data Engineer will also be able to modify a traditional open-source or other data stack to incorporate cognition. Candidates should come from a strong data engineering background and need experience with unstructured and structured data, transforming and analyzing the data with various tools, including components delivered through the Azure Marketplace, especially from the Cortana Intelligence suite. This is particularly required to build new platforms, which may need to move data to new technologies while guaranteeing data parity between new and old feeds.
Travel activity is low, depending on the project assigned, averaging below 10%.
What you'll do
- Partner with Market functions, Globally and Regionally Managed Businesses, and Above Market entities to build and deliver towards a portfolio of analytics services.
- Analyze complex data sets to answer strategic and operational business questions.
- Expertise in working with structured data; apply methods, technologies and techniques that address data architecture, integration and governance of data.
- Experience in database concepts, data modelling; data integration including design and architecture.
- Master data management, including customer data strategy, product data strategy and organizational hierarchy, and Information (Data) Governance, including strategy, implementation, business glossary, and metadata analysis; proven hands-on business knowledge working with database developers, DBAs, architects, data quality analysts, and other teams while seamlessly managing client relationships within the context of the individual role.
- Collaborate with other Business Analytics teams and networks to utilize new sources of data, for the purpose of developing insights and applicability across markets.
- Provide enablement services around predictive, diagnostic, and optimization analytics.
- Work with large, complex data sets.
- Solve difficult, non-routine analysis problems, applying advanced analytical methods as needed.
We offer you
We offer more than just a job. We put people first and inspire you to become the best version of yourself:
- Great benefits including competitive salary and a comprehensive social benefits package. We have one of the most competitive pension plans on the market, as well as flexible remuneration with tax advantages: health insurance, restaurant card, mobility plan, etc.
- Personal and professional growth through ongoing training and constant career opportunities reflecting our conviction that people are our most important asset.
- Hybrid working environment with flexible working scheme. Our state-of-the-art campus is dog friendly and equipped with a medical center, canteen, and areas to co-create, network, and chill!
- Recreation activities such as yoga, Zumba, etc. and a wide range of volunteering activities.
Minimum qualifications:
- Bachelor's degree in Computer Science, Systems Analysis or a related study, or equivalent experience.
- 7+ years' experience applying technical and critical-thinking skills, service-oriented architecture (SOA) principles, and Web services standards and best practices.
- 7+ years' experience as a software architect designing and delivering large-scale distributed software systems, preferably in a large-scale global business.
- Capability to architect highly scalable distributed systems using open-source tools and big data technologies (such as Hadoop, HBase, Spark, Impala, Storm, etc.) in integration with other open-source or proprietary tools available through the Azure Marketplace, especially the Cortana Intelligence components.
- Gather and process raw data at scale (including writing scripts, web scraping, calling APIs, writing SQL queries, etc.).
- Experience programming in SQL, SAP, Snowflake, DBT, Azure, and DevOps; methods for efficiently retrieving data, as well as data preparation/wrangling both on demand and in an industrialized way.
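The "write SQL queries" qualification above can be illustrated end to end with Python's built-in sqlite3; the table, columns, and rows are invented for the sketch:

```python
# Tiny end-to-end example of gathering and querying data with SQL,
# using Python's built-in sqlite3. Schema and rows are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (market TEXT, product TEXT, units INTEGER)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("ES", "coffee", 120), ("ES", "water", 80), ("CH", "coffee", 210)],
)

# Aggregate units per market, largest first.
rows = conn.execute(
    "SELECT market, SUM(units) AS total FROM sales "
    "GROUP BY market ORDER BY total DESC"
).fetchall()
print(rows)  # [('CH', 210), ('ES', 200)]
```

The same GROUP BY / ORDER BY pattern carries over unchanged to the warehouse engines the posting names (Snowflake, Azure), only the connection layer differs.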
Bonus Points if you:
- A Master's degree would be good to have (in a discipline like Data Science, Computer Science, or Economics).
- Experience programming in R or Python
Soft skills:
- You have strategic thinking and business acumen.
- You can organize several projects at the same time.
- Experience working in a global environment and with virtual teams.
About the IT Hub
At Nestlé IT, we are a diverse, global team of IT professionals in the biggest health, nutrition and wellness company of the world. We strive to create an environment where people are valued for who they are. We innovate every day through future ready technologies to create opportunities for Nestlé to delight consumers, customers and employees alike. We collaborate with partners around the world to deliver tangible value at global scale. We continuously work to develop our people to be future ready.
About Nestlé
We are Nestlé, the largest food and beverage company in the world, with a presence in more than 185 countries. With net sales of CHF 94.4 billion in 2022, the company has over 291,000 employees and 418 factories in 85 countries. Our values are based on respect: respect for ourselves, respect for others, respect for diversity, and respect for our future. Nestlé is dedicated to offering high-quality food and beverage products and services that contribute to the nutrition, health, and well-being of people, pets, and the planet. Additionally, it is committed to being a leading company in sustainability and achieving net zero greenhouse gas emissions by 2050. Want to learn more? Visit us at: www.nestle.com
We encourage the diversity of applicants across gender, age, ethnicity, nationality, sexual orientation, social background, religion or belief and disability.
Step outside your comfort zone; share your ideas, way of thinking and working to make a difference to the world, every single day. You own a piece of the action - make it count.
Front-end developer trainee
16 Mar · Ericsson · Madrid, ES
React C# Java MySQL Python Azure C++ Angular Docker Cloud Coumputing Scala Oracle Jira Hadoop AWS DevOps Cassandra Perl Kafka
Join our Team
About this opportunity:
The role resides within Solution Area (SA) in BCSS, operationally reporting to Software Pipelines & Support (SWPS) within Business Operating Support (BOS). BSS is designed to meet the demand for specific senior competence serving complex, multi-technology engagements to support business growth.
Be part of the technical DevOps for all the BOS Ericsson products. As part of the DevOps team, you will have the opportunity to learn and participate in the introduction of DevOps for all the BOS Ericsson products.
What you will do:
- Knowledge of DevOps products is a must: definition, analysis, and operational perspective.
- Develop technical presentations and technical documentation related to DevOps.
- Be able to help other team members with the DevOps competence that will be part of BOS delivery.
- Support deployment of the solution using DevOps as part of the SA_BOS team
- Participate in knowledge transfer, training, documentation, and information sharing with organizations involved in BOS delivery.
- Stay abreast of new technologies/technical areas and share information about the solution to enable customer competence build-up
Core Competences:
- DevOps products and usage
- Front-end and back-end tools knowledge: Angular, React
- Broad Technical Acumen
- Creative Thinking and technical background
- Technical DevOps background
- Presentation & Communication skills
- Team work & collaboration skills
- Infrastructure aspects to be considered:
- Infrastructure stack general experience up to the product layer (i.e. compute, virtualization, management, file systems)
- Infrastructure 3rd-party provider specific experience
- Cloud solutions: Amazon AWS, Azure, etc.
- Performance studies:
- Software (older or newer generation) 3pp product exposure: Jira, GitLab, MySQL, Postgres, Oracle, Cassandra, Docker, Hadoop, HBase, Kafka, etc.
- Design and development:
- Long exposure to Linux/Unix OS flavors, with operational hands-on and scripting skills (e.g. shell / Perl / Python or similar)
- Development exposure to one of the major programming languages, such as C, a functional language (e.g. Scala), or an object-oriented language (Java, Python, C++, C#)
- Front-end and back-end tools: Angular, React.
At Ericsson, you'll have an outstanding opportunity. The chance to use your skills and imagination to push the boundaries of what's possible. To build solutions never seen before to some of the world's toughest problems. You'll be challenged, but you won't be alone. You'll be joining a team of diverse innovators, all driven to go beyond the status quo to craft what comes next.
What happens once you apply?
Click Here to find all you need to know about what our typical hiring process looks like.
Encouraging a diverse and inclusive organization is core to our values at Ericsson, which is why we champion it in everything we do. We truly believe that by collaborating with people with different experiences we drive innovation, which is essential for our future growth. We encourage people from all backgrounds to apply and realize their full potential as part of our Ericsson team. Ericsson is proud to be an Equal Opportunity Employer.
Senior Backend Developer
14 Mar · Axiom Software Solutions · Málaga, ES
Agile TSQL Azure Jenkins Docker Cloud Computing Kubernetes Scala Git SOA Jira TDD Hadoop AWS Bash Spark Big Data Office
WHAT WE ARE LOOKING FOR
Required qualifications
· At least 3 years of experience working with Spark with Scala, software design patterns, and TDD.
· Experience working with big data – Spark, Hadoop, and Hive are a must; Azure Databricks is a plus.
· Agile approach for software development
· Experience and expertise across data integration and data management with high data volumes.
· Experience working in agile continuous integration/DevOps paradigm and tool set (Git, GitHub, Jenkins, Sonar, Nexus, Jira)
· Experience with different database structures, including Postgres, SQL, and Hive
· English (at least B2+)
Preferred qualifications
- Jenkins
- Bash script
- Control-M
- Software development life cycle (HP ALM...)
- Basics of cybersecurity & Quality (Sonar, Fortify…)
- Basics of Cloud computing (Docker, Kubernetes, OS3, Azure, AWS)
- SOA Architecture
WHAT WE OFFER YOU
- A competitive economic package
- Flexiworking
- Telecommuting
- Prime office space
- Top notch computer and office equipment at your disposal
Data Lake - Senior Developer
13 Mar · Axiom Software Solutions · Madrid, ES
Python TSQL Jenkins Ansible Git Hadoop AWS Bash Terraform Kafka Spark Tableau Office
CANDIDATES MUST BE LOCATED IN MADRID AS FACE-TO-FACE ATTENDANCE IS REQUIRED AT OUR OFFICES (first month: 3-4 days in the office; the following 3 months: 3 days; from the 4th month onwards: 2 days)
Technical Skills Required:
• Very deep understanding of Tableau
• Experience with SQL, Hive, and Hadoop
• Experience with Python, PySpark, Pandas, and JupyterLab (working with notebooks)
• Experience using AWS platform
• Experience with continuous integration and continuous delivery tools like Git, Jenkins etc.
• Agile development/Software life cycle.
• Excellent interpersonal and communication skills in English (B2+)
Nice to have Skills:
• Experience with Kafka
• Specifically, experience using EMR (Elastic MapReduce) in AWS to run Spark clusters.
• Knowledge of Terraform
• Experience with Ansible, Bash scripting, and boto
• Experience configuring continuous Integration and continuous delivery tools.
Qualities & Skills:
• Energetic, motivated, and determined
• Pragmatic and results-oriented
• Adaptable to diverse set of technical responsibilities
• Excellent analytical and problem-solving skills
• Productive and able to manage time effectively
• Strong written and verbal communication skills
Qualifications:
• Bachelor's degree in Computer Science/Information Technology or a related field, or substantial practical experience of software delivery at an advanced level