Grupo TECDATA Engineering
Data Engineer Junior/Mid
Grupo TECDATA Engineering · Madrid, ES
Remote Python TSQL Azure Cloud Computing AWS Spark Power BI Tableau
Requirements
- At least 2 years of experience.
- Languages: SQL and Python
- Data modeling: Kimball, Inmon, or other approaches
- ETL: experience with tools such as NiFi, Data Factory, or Kettle
- Orchestration: working knowledge of Airflow, Luigi, or Dagster (see the sketch after this list)
- Experience with visualization tools (Tableau, Qlik, Power BI)
- Work with cloud platforms (Azure, AWS, Google Cloud)
- Knowledge of Spark and dbt
- System and database administration
- CI/CD implementation
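As a purely illustrative aside (not part of the original posting), here is a minimal sketch of the kind of Airflow orchestration the requirements mention. It assumes Apache Airflow 2.x; the DAG id and the two placeholder tasks are invented for the example.

```python
# Minimal illustrative Airflow DAG (hypothetical; names are invented).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders():
    # Placeholder: pull raw data from a source system (API, database, files...).
    print("extracting orders")


def load_orders():
    # Placeholder: load the transformed data into the warehouse.
    print("loading orders")


with DAG(
    dag_id="daily_orders_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_orders)
    load = PythonOperator(task_id="load", python_callable=load_orders)
    extract >> load  # extract must finish before load starts
```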
Data Engineer Senior
New · PANEL Sistemas Informáticos
Data Engineer Senior
PANEL Sistemas Informáticos · Madrid, ES
Remote Python Azure Cloud Computing AWS Bash Agile Kafka Spark Power BI
We are growing our team and would like you to join us on a project where we work embedded with a client. Innovation and technological excellence are in our DNA, and we work with a positive attitude and passion to achieve our goal: helping our clients through their transformation process.
Do you want to be part of #TeamPanel?
We want people like YOU, passionate about technology, to join a project where we work hand in hand with one of our biggest clients.
What experience would we like you to have?
At least 2 years of prior experience as a Data Engineer or Data Analyst
Hard skills:
- Linux and Bash scripting.
- Data processing with Python.
- Data model design.
- Proficiency with relational and non-relational databases.
- Data analytics using Python.
- Data analytics using Business Intelligence tools such as Power BI.
- Python: advanced
- C++: intermediate
- ClearCase: intermediate
- Git: intermediate
- Linux: intermediate
Soft skills:
- Passion for technology.
- Autonomy in handling tasks.
- Effective teamwork and good team spirit.
- Quality and accountability in deliverables.
- Good communication.
- Quick adaptation to change.
- Proactivity.
- Critical thinking.
Nice to have:
- Command of techniques for efficient storage of large data volumes.
- Experience optimizing processing systems.
- Experience with processing tools such as Spark or Kafka.
- Working knowledge of orchestration tools such as Airflow or similar.
- Knowledge of cloud architectures such as Azure or AWS.
Your MISSION will be to:
- Take part in designing and building pipelines with Cloud or Open Source tools.
- Python programming: knowledge of object-oriented programming, design and creation of data transformations, flow optimization, data analysis (see the sketch after this list).
- Data modeling: physical data modelling and logical data modelling
- Migration of ETL technology to a Cloud or Open Source stack
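Purely as an illustration of the object-oriented data-transformation work described in the mission above (not from the posting), a tiny sketch assuming pandas is available; class and column names are invented.

```python
# Hypothetical transformation step written in an object-oriented style.
import pandas as pd


class DeduplicateOrders:
    """Drop duplicate order rows, keeping the most recent record per key."""

    def __init__(self, key: str = "order_id", timestamp: str = "updated_at"):
        self.key = key
        self.timestamp = timestamp

    def transform(self, df: pd.DataFrame) -> pd.DataFrame:
        # Sort so the newest record comes last, then keep that one per key.
        return (
            df.sort_values(self.timestamp)
            .drop_duplicates(subset=self.key, keep="last")
            .reset_index(drop=True)
        )


# Usage: clean_df = DeduplicateOrders().transform(raw_orders_df)
```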
Workplace:
100% remote (every day from wherever you want, within Spain, as long as your connection allows it :-)
What do we commit to?
Continuous training.
We will boost your talent with a Professional Development Plan and individualized mentoring programs.
A comprehensive training program, ranging from technical specialization to languages, project management skills, agile methodologies, and certifications, both online and in person.
Work-life balance policies such as full-remote teleworking, shorter working days on Fridays and in summer, flexible hours, days off at Christmas, and support for employees' geographic mobility.
Social benefits such as flexible compensation, a 100% top-up of sick leave, special conditions on private health insurance, and an employee VISA card.
Our company is today a reference in the Cloud space, taking part in several European research projects, with many more projects on which you can build a long-term career path.
Apply and let's get to know each other!
Senior Data Engineer
New · Marbill Technologies
Málaga, ES
Senior Data Engineer
Marbill Technologies · Málaga, ES
Python Agile TSQL Linux Docker Git Jira AWS Excel Terraform Kafka Spark Office
At Marbill, we're passionate about empowering businesses to succeed in the competitive world of e-commerce. Our innovative software solutions help merchants manage customers, mitigate risk, streamline payments, and enhance service offerings, ultimately optimizing their financial operations.
Join Marbill Technologies and thrive in a dynamic, global e-commerce environment!
What we offer:
- Flexible working hours – Work when you're most productive!
- Hybrid positions – Enjoy the best of both worlds, remote & office work.
- Top-notch equipment – Everything you need to excel in your role.
- Private health insurance – Your well-being matters to us.
- Referral bonuses – Get rewarded for bringing great talent on board.
- Comprehensive training – We invest in your growth from day one.
- Competitive salary – Based on your experience, with regular reviews.
- 25 days of annual leave – Recharge and enjoy your time off.
- Lunch & snacks – Keep energized throughout the day.
- Parking available – Convenient and stress-free.
- Sunshine almost every day – Work in a beautiful, sunny location.
We are an ambitious, fast-moving team that thrives in a dynamic and agile environment. Our professionals, hailing from over 30 countries, are resilient, accountable, and solution-oriented. We make things happen quickly, embracing challenges with fresh ideas and a focus on impactful results. We equip our team with the tools and support they need to succeed, fostering both professional and personal growth.
If you're looking to boost your career, now is the perfect opportunity! We are currently seeking a Senior Data Engineer to join our dynamic team.
What you’ll do:
- Design, develop, and optimize scalable data pipelines to ensure smooth data operations.
- Debug and troubleshoot Airflow tasks for efficient workflow execution.
- Write and maintain high-performance Python code for data engineering tasks.
- Develop and optimize SQL queries for relational databases (a small illustrative sketch follows this list).
- Manage and maintain cloud-based infrastructure, particularly AWS services.
- Utilize Git for version control and effective collaboration with the team.
- Oversee deployments and work with Jupyter Notebooks for data analysis and documentation.
- Provide technical mentorship to junior team members and contribute to best practices.
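Purely as an illustration of the day-to-day tasks listed above (not taken from the posting), a small hypothetical sketch that submits a SQL query to AWS Athena with boto3 and polls for completion; the database name, query, and S3 output location are placeholders.

```python
# Hypothetical sketch: run a SQL query on AWS Athena and wait for a terminal state.
# Assumes boto3 is installed and AWS credentials are configured.
import time

import boto3


def run_athena_query(sql: str, database: str, output_s3: str) -> str:
    """Submit a query to Athena and return its final state."""
    athena = boto3.client("athena")
    execution = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output_s3},
    )
    query_id = execution["QueryExecutionId"]
    while True:
        status = athena.get_query_execution(QueryExecutionId=query_id)
        state = status["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            return state
        time.sleep(2)  # poll until the query finishes


# Example with placeholder names:
# run_athena_query("SELECT count(*) FROM orders", "analytics_db", "s3://my-bucket/athena-results/")
```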
What you’ll bring:
- Minimum of 5 years of experience as a Data Engineer.
- Strong proficiency in SQL and relational database management.
- Hands-on experience with AWS services, including S3, Athena, and SageMaker.
- Advanced knowledge of Python for data engineering and automation tasks.
- Familiarity with Airflow for workflow orchestration.
- Comfortable working in a Linux environment and using Linux commands.
- Experience with Agile methodologies and tools like Jira.
- Strong problem-solving skills and ability to work effectively in a team environment.
- Experience optimizing Data Science scripts and deploying them into production environments.
Bonus Skills:
- Experience working with data lakes and data warehouses.
- Knowledge of containerization tools like Docker.
- Exposure to infrastructure-as-code tools such as Terraform.
- Experience with distributed computing frameworks (e.g., Spark).
- Familiarity with real-time data processing tools such as Kafka.
Apply now and let’s make an impact together!
Machine Learning Engineer
2 Mar · ARQUIMEA
San Cristóbal de La Laguna, ES
Machine Learning Engineer
ARQUIMEA · San Cristóbal de La Laguna, ES
Python Docker Git Fintech Machine Learning
We are a technology company operating globally. If you are passionate about technology and believe in its power to transform the world, ARQUIMEA is the place for you. Join us!
At ARQUIMEA, we are a technology company operating globally, providing innovative solutions and products in highly demanding sectors.
Our areas of activity are Aerospace, Defense & Security, Big Science, Biotechnology and Fintech.
ARQUIMEA Research Center (ARC), part of ARQUIMEA, was born in 2019 with the aim of inventing the technologies of tomorrow. It is an environment of innovation and excellence at the European level, from which senior and junior researchers from around the world develop disruptive technologies and business models that will serve as an engine of socio-economic growth in the medium and long term.
We are looking for a Machine Learning Engineer to develop, train and optimize classical, quantum and hybrid deep neural network models for prediction use cases: time series forecasting and 3D/volumetric reconstruction.
Tasks To Be Performed
- Data preparation and analysis for real and synthetic training datasets.
- Collaborate with scientific researchers to design, implement and test deep neural network methods under the classical, quantum, and hybrid neural networks paradigms.
- Collaborate with scientific researchers to analyze the implementation of state-of-the-art methods.
- Conduct hyperparameter tuning of the models and optimize the consumption of resources when training the models.
- Collaborate with scientific researchers to conduct experimental validation of new methods, and benchmarking with respect to state-of-the-art methods.
Requirements
- Degree or Master's degree in engineering or another relevant field.
- Strong Python programming skills, with experience in ML frameworks like PyTorch (an illustrative sketch follows this list).
- Deep understanding of artificial intelligence and machine learning, including deep learning architectures and experience with time series forecasting.
- Experience with version control and containerization, including Git and Docker.
- Knowledge of quantum computing, including quantum algorithms, and quantum machine learning approaches.
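An illustrative sketch only (not part of the listing) of the classical side of this role: a tiny PyTorch model for one-step time series forecasting. Layer sizes and names are arbitrary, and the quantum/hybrid aspects are left out.

```python
# Minimal illustrative PyTorch forecaster: predicts the next value of a series
# from a window of past values. Sizes and names are invented for the example.
import torch
import torch.nn as nn


class LSTMForecaster(nn.Module):
    def __init__(self, hidden_size: int = 32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x has shape (batch, window_length, 1); output has shape (batch, 1).
        output, _ = self.lstm(x)
        return self.head(output[:, -1, :])  # use the last hidden state


# Quick smoke test on random data: a batch of 8 windows of length 24.
model = LSTMForecaster()
prediction = model(torch.randn(8, 24, 1))
print(prediction.shape)  # torch.Size([8, 1])
```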
At ARQUIMEA, we value diversity and inclusion. We do not discriminate on the basis of race, color, religion, gender, sexual orientation, gender identity, national origin, age, disability, or other factors protected by law. All candidates will be considered equally based on their skills and experience.
Machine Learning Engineer
2 Mar · ARQUIMEA
Torrejón de Ardoz, ES
Machine Learning Engineer
ARQUIMEA · Torrejón de Ardoz, ES
Python Docker Git Fintech Machine Learning
We are a technology company operating globally. If you are passionate about technology and believe in its power to transform the world, ARQUIMEA is the place for you. Join us!
At ARQUIMEA, we are a technology company operating globally, providing innovative solutions and products in highly demanding sectors.
Our areas of activity are Aerospace, Defense & Security, Big Science, Biotechnology and Fintech.
ARQUIMEA Research Center (ARC), part of ARQUIMEA, was born in 2019 with the aim of inventing the technologies of tomorrow. It is an environment of innovation and excellence at the European level, from which senior and junior researchers from around the world develop disruptive technologies and business models that will serve as an engine of socio-economic growth in the medium and long term.
We are looking for a Machine Learning Engineer to develop, train and optimize classical, quantum and hybrid deep neural network models for prediction use cases: time series forecasting and 3D/volumetric reconstruction.
Tasks To Be Performed
- Data preparation and analysis for real and synthetic training datasets.
- Collaborate with scientific researchers to design, implement and test deep neural network methods under the classical, quantum, and hybrid neural networks paradigms.
- Collaborate with scientific researchers to analyze the implementation of state-of-the-art methods.
- Conduct hyperparameter tuning of the models and optimize the consumption of resources when training the models.
- Collaborate with scientific researchers to conduct experimental validation of new methods, and benchmarking with respect to state-of-the-art methods.
Requirements
- Degree or Master's degree in engineering or another relevant field.
- Strong Python programming skills, with experience in ML frameworks like PyTorch.
- Deep understanding of artificial intelligence and machine learning, including deep learning architectures and experience with time series forecasting.
- Experience with version control and containerization, including Git and Docker.
- Knowledge of quantum computing, including quantum algorithms, and quantum machine learning approaches.
At ARQUIMEA, we value diversity and inclusion. We do not discriminate on the basis of race, color, religion, gender, sexual orientation, gender identity, national origin, age, disability, or other factors protected by law. All candidates will be considered equally based on their skills and experience.
Data Engineer
1 Mar · Amazon
Madrid, ES
Data Engineer
Amazon · Madrid, ES
Python TSQL Cloud Computing Scala Hadoop AWS Big Data Spark
Are you looking for an opportunity to develop technology that will redefine the customer experience for one of the fastest growing and strategic organizations within Global Engineering Services (GES)? Are you interested in joining a global team that is innovating with CX to make Amazon's building network grow? This is your chance to get in on the ground floor with the Global Engineering Insights & Software Tools (GEIST) tech team that is shifting the paradigm for engineering services with disruptive experiences! The GEIST organization supports a vast portfolio of buildings within the Amazon network and the build-out standards and designs.
We are looking for a Data Engineer to play a pivotal role in developing and implementing a comprehensive data platform for GES business. The successful candidate will have a proven track record of building integrated data tools and platforms to support global businesses. This fast-paced, results-oriented role will involve developing the platform to support key decisions for Global Engineering Services. You will work closely with Engineering, Product and Technical Program teams as you develop forward-looking architecture and build strategies to complement the revolutionary GEIST vision.
In this role, you will have the freedom (and encouragement) to experiment, improve, invent, and innovate on behalf of our customers. You will have the satisfaction of being able to look back and say you were a key contributor to something special from its earliest stages.
Key job responsibilities
• Design, implement, and support data warehouse/data lake infrastructure using the AWS big data stack, Python, Redshift, QuickSight, Glue/Lake Formation, EMR/Spark, Athena, etc. (an illustrative sketch follows this list).
• Develop and manage ETLs to source data from various systems and create a unified data model for analytics and reporting.
• Create and support real-time data pipelines built on AWS technologies including EMR, Glue, Redshift/Spectrum, and Athena.
• Continually research the latest big data technologies to provide new capabilities and increase efficiency.
• Manage numerous requests concurrently and strategically, prioritizing when necessary.
• Partner/collaborate across teams/roles to deliver results.
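As a hedged illustration of the first bullet above (not Amazon's actual code), a small PySpark job of the kind that might run on EMR: it reads raw events from S3, aggregates them, and writes partitioned Parquet back to S3. Bucket paths and column names are invented.

```python
# Illustrative PySpark sketch; all paths and columns are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events_daily_rollup").getOrCreate()

# Read raw JSON events from S3 (placeholder path).
events = spark.read.json("s3://example-raw-bucket/events/")

# Aggregate to a simple daily metric per building.
daily = (
    events
    .withColumn("event_date", F.to_date("event_timestamp"))
    .groupBy("building_id", "event_date")
    .agg(F.count("*").alias("event_count"))
)

# Write partitioned Parquet for downstream tools (Athena, Redshift Spectrum, QuickSight).
daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-curated-bucket/daily_building_events/"
)
```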
About the team
The GEIST team serves as the software solution team to all Global Engineering needs, working with its customers to set a consolidated Tech strategy, and prioritizing investments to best serve their needs and our bottom line. The GEIST team is responsible for the sustainment of a large product suite ranging from Planning & Forecasting applications to robust in-flight management of field execution users. These systems are leveraged by several thousand internal and external users around the globe.
BASIC QUALIFICATIONS
- Experience in data engineering
- Experience with data modeling, warehousing and building ETL pipelines
- Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala)
- Experience with one or more scripting languages (e.g., Python, KornShell)
- Knowledge of cloud services such as AWS or equivalent
PREFERRED QUALIFICATIONS
- Experience with big data technologies such as: Hadoop, Hive, Spark, EMR
- Experience with any ETL tool, such as Informatica, ODI, SSIS, BODI, Datastage, etc.
Amazon is an equal opportunities employer. We believe passionately that employing a diverse workforce is central to our success. We make recruiting decisions based on your experience and skills. We value your passion to discover, invent, simplify and build. Protecting your privacy and the security of your data is a longstanding top priority for Amazon.
Data Engineer
28 Feb · BASF
Madrid, ES
Data Engineer
BASF · Madrid, ES
Python TSQL Azure
ABOUT US
At BASF Digital Hub Madrid we develop innovative digital solutions for BASF, create new exciting customer experiences and business growth, and drive efficiencies in processes, helping to strengthen BASF's position as the digital leader in the chemical industry. We believe the right path is through creativity, trial and error and great people working and learning together. Become part of our team and develop the future with us - in a global team that embraces diversity and equal opportunities.
RESPONSIBILITIES
- Design, develop, test, and maintain features for our Databricks Data Warehouse solution.
- Assess and implement the technical aspects of the access concept, including RBAC/ABAC governance, to ensure data security and compliance.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Write clean, maintainable, and efficient code using (mostly) Python.
- Support and manage Unity Catalog capabilities, governance, and domain-driven UC catalog management (a small illustrative sketch follows this list).
- Automate processes.
- Troubleshoot, debug, and optimize existing solutions.
- Stay up to date with the latest industry standards and technologies and strive for continuous improvement.
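A small illustrative sketch (not BASF's implementation) of the access-concept work mentioned above: granting and revoking table-level access in Databricks Unity Catalog from a notebook or job. It assumes a Unity Catalog-enabled workspace where `spark` is the active SparkSession; catalog, table, and group names are invented.

```python
# Hypothetical RBAC-style helpers for Unity Catalog grants via Spark SQL.
def grant_read_access(spark, table: str, principal: str) -> None:
    """Grant SELECT on a Unity Catalog table to a user or group."""
    spark.sql(f"GRANT SELECT ON TABLE {table} TO `{principal}`")


def revoke_read_access(spark, table: str, principal: str) -> None:
    """Revoke SELECT on a Unity Catalog table from a user or group."""
    spark.sql(f"REVOKE SELECT ON TABLE {table} FROM `{principal}`")


# Example usage inside a Databricks job (placeholder names):
# grant_read_access(spark, "analytics.sales.orders", "data-consumers")
```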
REQUIREMENTS
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Experience in cloud-based Data warehousing and/or Data Engineering (Azure).
- Strong knowledge of Python and SQL (Power Apps).
- Knowledge of Databricks and Databricks Unity Catalog.
- Experience in implementing access concepts, RBAC/ABAC Governance.
- Customer-focused with excellent communication and teamwork skills.
BENEFITS
- Responsibility from day one in a challenging work environment and "on-the-job" training as part of a committed team.
- Adequate compensation according to your qualifications and experience.
- A secure work environment, because your health, safety and wellbeing are always our top priority.
- Flexible work schedule and Home-office options, so that you can balance your working life and private life.
- Learning and development opportunities
- 23 holidays per year
- An additional 5 days (readjustment days) and 2 days (cultural days)
- A collaborative, trustful and innovative work environment
- Being part of an international team and working on global projects
- Relocation assistance to Madrid provided
At BASF, the chemistry is right.
Because we are counting on innovative solutions, on sustainable actions, and on connected thinking. And on you. Become a part of our formula for success and develop the future with us - in a global team that embraces diversity and equal opportunities irrespective of gender, age, origin, sexual orientation, disability or belief.
BASF
Madrid, ES
DevOps Engineer Supply Chain Planning
BASF · Madrid, ES
Agile Scrum Cloud Computing DevOps ERP SAP ERP
ABOUT US
At BASF Digital Hub Madrid we develop innovative digital solutions for BASF, create new exciting customer experiences and business growth, and drive efficiencies in processes, helping to strengthen BASF´s position as the digital leader in the chemical industry. We believe the right path is through creativity, trial and error and great people working and learning together. Become part of our team and develop the future with us - in a global team that embraces diversity and equal opportunities.
JOIN THE TEAM
Development and support (DevOps) for S/4 HANA conversion-related integration requirements in the Supply Chain Planning & Visibility product family
• Technical and integration support during the conversion of our SAP ERP system to S/4 HANA
• The focus area will be interface- and integration-related tasks for supply chain planning satellite systems arising from the conversion project
• Working in an agile way within a virtual global team
RESPONSIBILITIES
- You will get to know the current integration implementations of SCM developments within SAP APO, OMP, SAP IBP, SAP BW, and SAP ERP by starting your skill-up in one of these squads
- You will understand the integration implications of architectural deviations between S/4 HANA and the as-is system landscape
- You will provide hands-on support for the technical migration of the existing integration landscape during the core conversion of our SAP ERP system
- You will closely interact with central teams in the conversion project to ensure our department's (cloud) integration scenarios can be transferred smoothly to the new SAP S/4 HANA based environment
- You will closely collaborate with conversion project members as well as with the existing projects transforming BASF's planning landscape, and report back to relevant stakeholders
- You will develop, hands-on, alternative integration solutions in case of unexpected findings during the conversion project
- You will continuously develop, maintain, and monitor interfaces for different on-premise and cloud integration scenarios, with a focus on technical and integration support
- You will help align architectural decisions, set up system connections and dataflows, and schedule jobs across systems
- You will live end-to-end responsibility, adapting to new requirements and learning new technologies within a virtual global agile team (Scrum).
QUALIFICATIONS
- Ideally a minimum of 3 years hands-on development experience
- Hands-on SAP ABAP programming skills (ideally ABAP OO & RAP)
- Knowledge of S/4 HANA architecture and integration approaches
- SAP enhancement skills: BAdIs, SAP Note implementation, User Exits.
- Hands-on experience with OMP, IBP, or comparable technical integration (setup of system connections, dataflows, etc.)
- Excellent communication skills in English
- Good understanding of security implications when setting up cloud solutions
- Basic Supply Chain Planning process knowledge
BENEFITS
- A secure work environment, because your health, safety and wellbeing are always our top priority.
- Flexible work schedule and Home-office options, so that you can balance your working life and private life.
- Learning and development opportunities
- 23 holiday days per year
- 5 additional days (readjustment)
- 2 cultural days
- A collaborative, trustful and innovative work environment
- Being part of an international team and working on global projects
- Relocation assistance to Madrid provided
At BASF, the chemistry is right.
Because we are counting on innovative solutions, on sustainable actions, and on connected thinking. And on you. Become a part of our formula for success and develop the future with us - in a global team that embraces diversity and equal opportunities irrespective of gender, age, origin, sexual orientation, disability or belief.
Data Engineer
28 Feb · Grupo NS
Data Engineer
Grupo NS · Madrid, ES
Remote TSQL Scala Jira Big Data Spark
At Grupo NS we are recruiting a Data Engineer with an intermediate/high level of English and experience in Big Data technologies (Spark Scala), SQL, PLSQL, Jira, ControlM, and the Banking and Risk domain.
Grupo NS is a company that values both the technological professional profile of its employees and the interest and aptitude they show when developing new projects.
That is why we look for persistent people who are eager to grow and learn.